764 results for Computer frameworks


Relevance: 20.00%

Abstract:

This work presents the implementation and comparison of three techniques of three-dimensional computer vision:
• Stereo vision: correlation between two 2D images;
• Sensor fusion: use of different sensors, a 2D camera plus a 1D ultrasound sensor;
• Structured light.
The techniques were evaluated with respect to the following characteristics:
• Computational effort (elapsed time to obtain the 3D information);
• Influence of environmental conditions (noise due to non-uniform lighting, overlighting and shadows);
• Cost of the infrastructure required by each technique;
• Analysis of uncertainties, precision and accuracy.
Matlab, version 5.1, was chosen to implement the algorithms of the three techniques because of the simplicity of its commands, programming and debugging, and because it is well known and widely used in the academic community, allowing the results of this work to be reproduced and verified. Examples of three-dimensional vision applied to robotic assembly ("pick-and-place") tasks are presented.
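The first of these techniques, stereo vision by correlation, recovers depth from the horizontal shift (disparity) of matching patches between the two 2D images. The following is only a rough Python/NumPy sketch of that idea, not the thesis's Matlab 5.1 implementation; the block size, disparity range and calibration values are hypothetical.

import numpy as np

def disparity_block_matching(left, right, block=7, max_disp=32):
    # Correlate each block of the left image against horizontally shifted
    # blocks of the right image; the shift with the lowest sum of absolute
    # differences is taken as the disparity at that pixel.
    left = left.astype(np.float32)
    right = right.astype(np.float32)
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1]
                cost = np.abs(patch - cand).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# For a rectified stereo pair, depth is inversely proportional to disparity:
# Z = f * B / d, with focal length f and baseline B (hypothetical calibration values).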

Relevance: 20.00%

Abstract:

The use of water-sensitive papers is an important tool for assessing the quality of pesticide application on crops, but manual analysis is laborious and time-consuming. Thus, this study aimed to evaluate and compare the results obtained from four software programs for spray droplet analysis on different scanned images of water-sensitive papers. After spraying, papers with four droplet deposition patterns (varying droplet spectra and densities) were analyzed manually and by means of the following computer programs: CIR, e-Sprinkle, DepositScan and Conta-Gotas. The volume median diameter, the number median diameter and the number of droplets per unit target area were studied. There is a strong correlation between the values measured by the different programs and by the manual analysis, but the numerical values measured for the same paper differ greatly. Thus, it is not advisable to compare results obtained from different programs.
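For reference, the droplet-spectrum figures compared in the study (number median diameter, volume median diameter and droplets per unit area) can be computed from a list of measured droplet diameters roughly as follows. This is only an illustrative Python sketch with hypothetical values; it is not the algorithm used by any of the four programs.

import numpy as np

def droplet_statistics(diameters_um, target_area_cm2):
    # Number median diameter (NMD), volume median diameter (VMD) and
    # droplet density from measured droplet diameters (micrometres).
    d = np.sort(np.asarray(diameters_um, dtype=float))
    nmd = np.median(d)                        # 50% of droplets are smaller
    vol = d ** 3                              # droplet volume is proportional to d^3
    cum = np.cumsum(vol) / vol.sum()
    vmd = d[np.searchsorted(cum, 0.5)]        # 50% of the spray volume is in smaller drops
    density = len(d) / target_area_cm2        # droplets per cm^2 of target
    return nmd, vmd, density

# Hypothetical measurements from one scanned paper (micrometres, cm^2):
nmd, vmd, dens = droplet_statistics([110, 150, 210, 260, 320, 400], 2.0)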

Relevance: 20.00%

Abstract:

This thesis concentrates on the validation of a generic thermal hydraulic computer code, TRACE, under the challenges of the VVER-440 reactor type. The code's capability to model the VVER-440 geometry and the thermal hydraulic phenomena specific to this reactor design has been examined and demonstrated to be acceptable. The main challenge in VVER-440 thermal hydraulics appeared in the modelling of the horizontal steam generator; the difficulty lies not in the code physics or numerics but in the formulation of a representative nodalization structure. Another VVER-440-specific feature, the hot leg loop seals, challenges system codes in general but proved readily representable. Computer code models have to be validated against experiments to achieve confidence in them. When a new computer code is to be used for nuclear power plant safety analysis, it must first be validated against a large variety of experiments, and the validation process has to cover both the code itself and the code input. Uncertainties of different natures are identified in the different phases of the validation procedure and can even be quantified. This thesis presents a novel approach to input model validation and uncertainty evaluation in the different stages of the computer code validation procedure. It also demonstrates that in safety analysis there are inevitably significant uncertainties that are not statistically quantifiable; these need to be, and can be, addressed by other, less simplistic means, ultimately relying on the competence of the analysts and the capability of the community to support the experimental verification of analytical assumptions. This approach essentially complements the commonly used uncertainty assessment methods, which usually rely on statistical methods alone.
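The statistical uncertainty assessment methods referred to above are commonly based on non-parametric tolerance limits; Wilks' formula, for example, gives the minimum number of code runs needed for a given coverage/confidence statement. The sketch below illustrates only that standard approach, not a method or result taken from this thesis.

import math

def wilks_sample_size(coverage=0.95, confidence=0.95):
    # Smallest n such that the maximum of n random code runs bounds the
    # `coverage` quantile of the output with probability `confidence`
    # (first-order, one-sided Wilks tolerance limit).
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_sample_size())  # 59 runs for a one-sided 95%/95% statement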

Relevance: 20.00%

Abstract:

The aim of the present study was to measure full epidermal thickness, stratum corneum thickness, rete length, dermal papilla widening and suprapapillary epidermal thickness in psoriasis patients using a light microscope and computer-supported image analysis. The data obtained were analyzed in terms of patient age, type of psoriasis, total body surface area involvement, scalp and nail involvement, duration of psoriasis, and family history of the disease. The study was conducted on 64 patients and 57 controls whose skin biopsies were examined by light microscopy. The acquired microscopic images were transferred to a computer and measurements were made using image analysis. The skin biopsies, taken from different body areas, were examined for parameters such as epidermal, corneal and suprapapillary epidermal thickness. The most prominent increase in thickness was detected in the palmar region. The increase in corneal thickness was more pronounced in patients with scalp involvement than in patients without scalp involvement (t = -2.651, P = 0.008). The most prominent increase in rete length was observed in the knees (median: 491 µm, t = 10.117, P < 0.001). The difference in rete length between patients with a positive and a negative family history was significant (t = -3.334, P = 0.03), the length being 27% greater in psoriasis patients without a family history. The differences in dermal papilla distances among patients were very small. We conclude that microscope-supported thickness measurements provide objective results.
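The group comparisons reported above are t-tests on the image-analysis measurements. As a generic illustration only (the thickness values below are hypothetical, not the study's data), such a comparison could be run as:

import numpy as np
from scipy import stats

# Hypothetical epidermal thickness measurements in micrometres
with_scalp = np.array([310., 355., 290., 400., 372., 338.])
without_scalp = np.array([265., 300., 248., 310., 280., 295.])

# Independent-samples t-test, as used for the group comparisons above
t, p = stats.ttest_ind(with_scalp, without_scalp)
print(f"t = {t:.3f}, P = {p:.3f}")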

Relevance: 20.00%

Abstract:

How does a successful programmer work? The tasks of programming computer games and of programming industrial, safety-critical systems appear quite different. Through a careful empirical study, the thesis compares and contrasts these two forms of programming and shows that programming involves more than technical skill. Drawing on hermeneutic and rhetorical theory, and with the help of both cultural studies and computer science, the thesis shows that the programmers' traditions and values are fundamental to their work, and that both kinds of programming can be understood and analysed within the classical tradition of text interpretation. Moreover, computer programs can be viewed and analysed with the help of classical theories of speech production in practice; in this context, programs are seen as a kind of utterance. Overall, the thesis advocates a return to the foundations of science, namely a constant, unceasing cyclical movement between experiencing and understanding. This stands in contrast to a reductionist view of science, which draws a sharp line between the subjective and the objective and thus presupposes the possibility of attaining complete knowledge. Incomplete knowledge is the domain of interpretation and hermeneutics. The aim of the thesis is to demonstrate, through examples, the cultural, hermeneutic and rhetorical nature of programming.

Relevance: 20.00%

Abstract:

As the world becomes more technologically advanced and economies become globalized, computer science is evolving faster than ever before. With this evolution and globalization comes the need for sustainable university curricula that adequately prepare graduates for life in the industry. Additionally, behavioural or “soft” skills have become just as important as technical abilities and knowledge, or “hard” skills. The objective of this study was to investigate the current skill gap between computer science university graduates and actual industry needs, as well as the sustainability of current computer science curricula, by conducting a systematic literature review of existing publications on the subject and a survey of recently graduated computer science students and their work supervisors. A quantitative study was carried out with respondents from six countries, mainly Finland; 31 of the responses came from recently graduated computer science professionals and 18 from their employers. The observed trends suggest that a skill gap really does exist, particularly with “soft” skills, and that many companies are forced to provide additional training to newly graduated employees if they are to be successful at their jobs.

Relevance: 20.00%

Abstract:

The advancement of science and technology makes it clear that no single perspective is sufficient to describe the true nature of any phenomenon, which is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on the one hand, it sheds light on how nature works and how it processes information; on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modeling framework whose rationale is built upon the biochemical interactions happening within a cell.

The process of gene assembly in ciliates has attracted a lot of attention as a research topic over the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathway recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs. We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model.

Reaction systems (RS) are the other nature-inspired modeling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modeling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state and periodicity, in order to do model checking of reaction system based models. We prove that the complexity of the decision problems related to these properties ranges from P to NP- and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between the species, and propose an algorithm to list the conserved sets of a given reaction system.
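The reaction systems framework mentioned above has a simple set-based semantics: a reaction is a triple of reactant, inhibitor and product sets; it is enabled on a state exactly when all of its reactants and none of its inhibitors are present; and the next state is the union of the products of all enabled reactions. The following is a minimal illustrative sketch of that core semantics (the example reaction is hypothetical, not a model from the thesis).

from dataclasses import dataclass

@dataclass(frozen=True)
class Reaction:
    reactants: frozenset   # entities that must all be present
    inhibitors: frozenset  # entities none of which may be present
    products: frozenset    # entities produced when the reaction is enabled

def enabled(reaction, state):
    # A reaction is enabled on a state (a set of entities) iff all its
    # reactants are present and none of its inhibitors are.
    return reaction.reactants <= state and not (reaction.inhibitors & state)

def result(reactions, state):
    # One step of a reaction system: the union of the products of all
    # enabled reactions (entities do not persist between steps).
    out = set()
    for r in reactions:
        if enabled(r, state):
            out |= r.products
    return out

# Tiny hypothetical example: heat induces a stress response unless a chaperone is present
rs = [Reaction(frozenset({"heat"}), frozenset({"chaperone"}), frozenset({"stress"}))]
print(result(rs, {"heat"}))  # {'stress'}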

Relevance: 20.00%

Abstract:

The computer game industry has grown steadily for years, and in revenues it can be compared to the music and film industries; it has also been moving to digital distribution. Computer gaming and the concept of the business model are discussed both among industry practitioners and in the scientific community. The significance of the business model concept has increased in the scientific literature recently, although the concept itself is still much debated. This thesis studies the role of the business model in the computer game industry. Computer game developers, designers, project managers and organization leaders in 11 computer game companies were interviewed. The data were analyzed to identify the important elements of the computer game business model, how the business model concept is perceived, and how the growth of the organization affects the business model. Human capital was identified as crucial to the business. As games are partly a product of creative thinking, innovation and the creative process are also highly valued, and the same applies to the technical skills required to perform various activities. Marketing and customer relationships are also considered key elements of the computer game business model. Financing and partners are important especially for startups, which depend on external funding and third-party assets. The results of this study provide organizations with an improved understanding of how the organization is built and which business model elements carry the most weight.

Relevance: 20.00%

Abstract:

Today, user experience and usability are becoming major design issues in software applications, as many processes are adapted to new technologies. The study of user experience and usability should therefore be included in every software development project, and both should be tested to obtain traceable results. Given the variety of testing methods available to evaluate these concepts, a non-expert may be unsure which one to choose and how to interpret the outcomes of the process. This work aims to create a process that eases the whole testing methodology, based on the process proposed by Seffah et al., together with a supporting software tool for following these user experience and usability testing methods.
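The traceable results of usability testing are usually numeric scores. Purely as an example of such a metric (the System Usability Scale is a standard questionnaire, not necessarily one used by the process or tool described here), a minimal scoring sketch:

def sus_score(responses):
    # System Usability Scale score from ten 1-5 Likert responses.
    # Odd-numbered items contribute (response - 1), even-numbered items
    # contribute (5 - response); the sum is scaled to 0-100.
    assert len(responses) == 10
    contrib = [(r - 1) if i % 2 == 0 else (5 - r) for i, r in enumerate(responses)]
    return sum(contrib) * 2.5

# Hypothetical questionnaire answers from one test participant
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0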

Relevance: 20.00%

Abstract:

The effects of two types of small-group communication, synchronous computer-mediated and face-to-face, on the quantity and quality of verbal output were compared. Quantity was defined as the number of turns taken per minute, the number of Analysis-of-Speech units (AS-units) produced per minute, and the number of words produced per minute. Quality was defined as the number of words produced per AS-unit. In addition, the interaction of gender and type of communication was explored for any differences in the output produced. Questionnaires were also given to participants to determine attitudes toward computer-mediated and face-to-face communication. Thirty intermediate-level students from the Intensive English Language Program (IELP) at Brock University participated in the study, including 15 females and 15 males. Nonparametric tests, including the Wilcoxon matched-pairs test, Mann-Whitney U test, and Friedman test, were used to test for significance at the p < .05 level. No significant differences were found in the effects of computer-mediated and face-to-face communication on the output produced during follow-up speaking sessions. However, the quantity and quality of interaction were significantly higher during face-to-face sessions than during computer-mediated sessions. No significant differences were found in the output produced by males and females in these two conditions. While participants felt that the use of computer-mediated communication may aid in the development of certain language skills, they generally preferred face-to-face communication. These results differed from previous studies that found a greater quantity and quality of output, in addition to greater equality of interaction, during computer-mediated sessions in comparison to face-to-face sessions (Kern, 1995; Warschauer, 1996).
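The quantity and quality measures defined above are simple ratios over a transcript. A minimal illustrative sketch follows; the transcript representation and example data are hypothetical, not taken from the study.

def output_metrics(transcript, minutes):
    # transcript: list of turns, each turn a list of AS-unit strings.
    # Returns turns per minute, AS-units per minute, words per minute
    # (quantity) and words per AS-unit (quality).
    turns = len(transcript)
    as_units = sum(len(turn) for turn in transcript)
    words = sum(len(unit.split()) for turn in transcript for unit in turn)
    return {
        "turns_per_min": turns / minutes,
        "as_units_per_min": as_units / minutes,
        "words_per_min": words / minutes,
        "words_per_as_unit": words / as_units if as_units else 0.0,
    }

# Hypothetical two-turn excerpt covering one minute of talk
print(output_metrics([["I think the picture shows a market"],
                      ["yes", "and there are two people buying fruit"]], 1.0))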

Relevance: 20.00%

Abstract:

The quantitative component of this study examined the effect of computer-assisted instruction (CAI) on science problem-solving performance, as well as the significance of logical reasoning ability to this relationship. I had the dual role of researcher and teacher, as I conducted the study with 84 grade seven students to whom I simultaneously taught science on a rotary basis. A two-treatment research design using this sample of convenience allowed for a comparison of the problem-solving performance of a CAI treatment group (n = 46) with that of a laboratory-based control group (n = 38). Science problem-solving performance was measured by a pretest and posttest that I developed for this study. The validity of these tests was addressed through critical discussions with faculty members and colleagues, as well as through feedback gained in a pilot study. Reliability between the pretest and the posttest was high; students who tended to score high on the pretest also tended to score high on the posttest. Interrater reliability was found to be high for 30 randomly selected test responses, which were scored independently by two raters (i.e., myself and my faculty advisor). Results indicated that the form of computer-assisted instruction used in this study did not significantly improve students' problem-solving performance. Logical reasoning ability was measured by an abbreviated version of the Group Assessment of Logical Thinking (GALT). Logical reasoning ability was found to be correlated with problem-solving performance: students with high logical reasoning ability tended to do better on the problem-solving tests and vice versa. However, no significant difference in problem-solving improvement was observed between the laboratory-based instruction group and the CAI group for students varying in level of logical reasoning ability. Nonsignificant trends were noted in the results obtained from students of high logical reasoning ability, but these require further study. It was acknowledged that conclusions drawn from the quantitative component of this study were limited, as further modifications of the tests were recommended, as was the use of a larger sample size.

The purpose of the qualitative component of the study was to provide a detailed description of my thesis research process as a Brock University Master of Education student. My research journal notes served as the database for open coding analysis. This analysis revealed six main themes that best described my research experience: research interests, practical considerations, research design, research analysis, development of the problem-solving tests, and scoring scheme development. These important areas of my thesis research experience were recounted in the form of a personal narrative. It was noted that the research process was a form of problem solving in itself, as I made use of several problem-solving strategies to achieve the desired thesis outcomes.

Relevance: 20.00%

Abstract:

The study examined coaches' usage of text-based computer-mediated communication (CMC) media (e.g., text messaging, email) in the coach-player relationship. Data were collected by surveying Ontario-based male baseball coaches (n = 86) who coached players between 15 and 18 years old. Predictions were made regarding how demographic factors such as age and coaching experience affected coaches' CMC use and opinions. Results indicated that over 76% of respondents never used any CMC media other than email and team websites in their interactions with players. Results also revealed that coaches' usage rates contrasted with their opinions of the usefulness of the media and with their perceptions of players' use of them. Coaches characterized most CMC media as limited, unnecessary, and sometimes inappropriate. Additional research should explore players' CMC usage rates and possible guidelines for the use of the new media in authority relationships. Academia needs to keep pace with the developments in this area.

Relevance: 20.00%

Abstract:

This study compared the relative effectiveness of two computerized remedial reading programs in improving the reading word recognition, rate, and comprehension of adolescent readers demonstrating significant and longstanding reading difficulties. One of the programs was the Autoskill Component Reading Subskills Program, which provides instruction in isolated letters, syllables, and words to a point of rapid automatic responding, and which incorporates reading disability subtypes in its approach. The second program, Read It Again, Sam, delivers a repeated reading strategy. The study also examined the feasibility of using peer tutors in association with these two programs. Grade 9 students at a secondary vocational school who satisfied specific criteria with respect to cognitive and reading ability participated. Eighteen students were randomly assigned to three matched groups, based on prior screening with a battery of reading achievement tests. Two groups received training with one of the computer programs; the third group acted as a control and received the remedial reading program offered within the regular classroom. The groups met daily with a trained tutor for approximately 35 minutes and were required to accumulate twenty hours of instruction. At the conclusion of the program, the pretest battery was repeated. No significant differences were found between the treatment effects of the two computer groups. Each of the two treatment groups achieved significantly improved reading word recognition and rate relative to the control group. Comprehension gains were modest: the treatment groups demonstrated a significant gain, relative to the control group, on one of the three comprehension measures, and only trends toward a gain were noted on the remaining two. The tutoring partnership appeared to be a viable alternative for the teacher seeking to provide individualized computerized remedial programs for adolescent unskilled readers. Both programs took advantage of computer technology in providing individualized drill and practice, instant feedback, and ongoing recordkeeping. With limited cautions, each of these programs was considered effective and practical for use with adolescent unskilled readers.