947 results for 005 Computer programming, programs
Abstract:
Concurrent programs are hard to test due to the inherent nondeterminism. This paper presents a method and tool support for testing concurrent Java components. Tool support is offered through ConAn (Concurrency Analyser), a tool for generating drivers for unit testing Java classes that are used in a multithreaded context. To obtain adequate controllability over the interactions between Java threads, the generated driver contains threads that are synchronized by a clock. The driver automatically executes the calls in the test sequence in the prescribed order and compares the outputs against the expected outputs specified in the test sequence. The method and tool are illustrated in detail on an asymmetric producer-consumer monitor. Their application to testing over 20 concurrent components, a number of which are sourced from industry and were found to contain faults, is presented and discussed.
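The clock-based synchronization idea can be sketched outside Java as well. The following Python sketch is not ConAn's actual driver and all names are illustrative; it shows test-sequence threads blocked on a logical clock and released one per tick, so the calls execute in the prescribed order despite thread scheduling:

```python
import threading

class Clock:
    """Logical clock: threads block until the clock reaches their tick."""
    def __init__(self):
        self._tick = 0
        self._cond = threading.Condition()

    def await_tick(self, n):
        # Block the calling thread until the clock has reached tick n.
        with self._cond:
            while self._tick < n:
                self._cond.wait()

    def advance(self):
        with self._cond:
            self._tick += 1
            self._cond.notify_all()

def run_test_sequence(steps):
    """Run each callable in its own thread, released one per tick,
    and return the results in execution order."""
    clock = Clock()
    trace = []
    done = [threading.Event() for _ in steps]

    def worker(i, call):
        clock.await_tick(i + 1)   # wait until it is this call's turn
        trace.append(call())
        done[i].set()             # signal the driver before the next tick

    threads = [threading.Thread(target=worker, args=(i, c))
               for i, c in enumerate(steps)]
    for t in threads:
        t.start()
    for i in range(len(steps)):
        clock.advance()           # release step i ...
        done[i].wait()            # ... and wait for it to finish
    for t in threads:
        t.join()
    return trace
```

A real driver would also time-stamp outputs and compare them against the expected outputs of the test sequence; the sketch only fixes the interleaving.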
Abstract:
A questionnaire on lectures was completed by 351 students (84% response) and 35 staff (76% response) from all five years of the veterinary course at the University of Queensland. Staff and students in all five years offered limited support for a reduction in the number of lectures in the course and the majority supported a reduction in the number of lectures in the clinical years. Students in the clinical years only and appropriate staff agreed that the number of lectures in fifth year should be reduced but were divided as to whether lectures in fifth year should be abolished. There was limited support for replacement of some lectures by computer assisted learning (CAL) programs, but strong support for replacement of some lectures by subject-based problem based learning (PBL) and strong support for more self-directed learning by students. Staff and students strongly supported the inclusion of more clinical problem solving in lectures in the clinical years and wanted these lectures to be more interactive. There was little support for lectures in the clinical years to be of the same type as in the preclinical years.
Abstract:
Aim To assess the effectiveness of a program of computer-generated tailored advice for callers to a telephone helpline, and to assess whether it enhanced a series of callback telephone counselling sessions in aiding smoking cessation. Design Randomized controlled trial comparing: (1) untailored self-help materials; (2) computer-generated tailored advice only; and (3) computer-generated tailored advice plus callback telephone counselling. Assessment surveys were conducted at baseline, 3, 6 and 12 months. Setting Victoria, Australia. Participants A total of 1578 smokers who called the Quitline service and agreed to participate. Measurements Smoking status at follow-up; duration of cessation, if quit; use of nicotine replacement therapy (NRT); and extent of participation in the callback service. Findings At the 3-month follow-up, significantly more (χ²(2) = 16.9; P < 0.001) participants in the computer-generated tailored advice plus telephone counselling condition were not smoking (21%) than in either the computer-generated advice only (12%) or the control condition (12%). Proportions reporting not smoking at the 12-month follow-up were 26%, 23% and 22%, respectively (NS) for point prevalence, and, for 9 months' sustained abstinence, 8.2%, 6.0% and 5.0% (NS). In the telephone counselling group, those receiving callbacks were more likely than those who did not to have sustained abstinence at 12 months (10.2% compared with 4.0%, P < 0.05). Logistic regression on 3-month data showed significant independent effects on cessation of telephone counselling and use of NRT, but not of computer-generated tailored advice. Conclusion Computer-generated tailored advice did not enhance telephone counselling, nor did it have any independent effect on cessation. This may be due to poor timing of the computer-generated tailored advice and poor integration of the two modes of advice.
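The headline comparison is a chi-square test on a groups-by-outcome contingency table. As a hedged illustration (the counts below are reconstructed from the reported group sizes and quit rates, not the study's raw data), the statistic can be computed as:

```python
def chi_square(table):
    """Pearson chi-square statistic for a contingency table (rows = groups)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of group and outcome.
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Illustrative counts only: three groups of 526 callers with quit rates
# of roughly 21%, 12% and 12% (columns: quit, still smoking).
table = [[110, 416], [63, 463], [63, 463]]
```

With two degrees of freedom, values above 5.99 are significant at the 5% level.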
Abstract:
The research analyzes the historical constitution of the History of Education discipline taught at the Faculdade de Filosofia Ciências e Letras do Estado do Espírito Santo, later incorporated into the Universidade Federal do Espírito Santo, between 1951 and 2000. It investigates the historical constitution of the discipline; the programmatic, legal and institutional transformations concerning History of Education; and its historiographical approaches, periodizations and concepts of time, history and education. The theoretical and methodological grounding draws dialogically on the conceptual and methodological work of Carlo Ginzburg and Mikhail Bakhtin. Starting from the concepts of polyphony and dialogism, common to both, the study investigated the voices and dialogues imprinted, in more superficial or in deeper layers, in the narratives of the History of Education discipline and its teaching, as found in the documentary corpus consulted and analyzed, which comprises teaching programs, transparencies, laws, curricular structures, departmental documents, reviews and reading notes, required and supplementary bibliography, assessments, and interviews. From apparently negligible data in that corpus (clues, traces and signs), the study sought to reconstruct a complex historical reality that cannot be experienced directly. By historically investigating the trajectory of the History of Education discipline and its teaching through legal, programmatic and institutional parameters, it was possible to see that the deepest changes in the discipline did not originate in legislation or curricular restructurings, but in the places where historical knowledge is produced and socialized.
During the period analyzed, the two spheres of historiographical production that most influenced the approaches, periodizations and concepts of time, history and education in the History of Education discipline of the pedagogy course studied were the publisher responsible for publishing and disseminating the History of Education manuals of the Atualidades Pedagógicas collection (1951-1979), and the graduate programs in Education and History (1980-2000). Between 1951 and the late 1970s, the History of Education manuals shaped the organization and programming of the teaching of History of Education, with a philosophical approach centered on the history of pedagogical ideas and on analyses of philosophers' and educators' thought on education and its place within European philosophical doctrines. From 1980 onwards, economic, political and ideological approaches to educational historical contexts came to predominate in the teaching programs of History of Education I and II, and remained in force until the mid-1990s. In History of Education I the approach is marked by analyses of the context of production and of the organization of social classes; History of Education II, until about 1995, deals with Brazilian education. After 1995, under an approach grounded in Dependency Theory, the documents consulted begin to show other marks suggesting a focus on the political and social dimension, addressing the History of Brazilian Education through social movements and their respective educational projects.
Abstract:
The adoption at federal universities of the affirmative action known as quotas (seats reserved for students who completed all of secondary school at public schools) remains controversial even after the enactment of Law No. 12,711 of 2012, which makes contributions to the study of this system at Brazilian public universities timely. This work aimed to study the academic performance of quota students at the Centro de Ciências Jurídicas e Econômicas and the Centro Tecnológico of the Universidade Federal do Espírito Santo (UFES) from 2008 to 2013, considering the two annual admission intakes of the programs concerned. The study analyzed the academic achievement of students in 15 undergraduate programs offered by those Centers, seeking to determine in which programs and course subjects there are significant performance differences, based on the academic performance coefficient (CRA) and on the final averages of the subjects taken by quota and non-quota students, with a view to proposing institutional actions to reduce those differences. The research, quantitative in nature, used the ANOVA statistical method of analysis of variance. The analyses made it possible to infer performance differences in the engineering programs, mainly in calculus and algebra subjects. The Law program, in turn, showed a significant performance difference, although the averages were not below the passing threshold. When the comparison is made by sex, a significant performance disparity was found in the Economics and Computer Science programs in favor of female quota students over female non-quota students.
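A one-way ANOVA, as used in the study, reduces to comparing between-group and within-group variance. A minimal sketch of the F statistic, with made-up grade lists (illustrative only, not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over lists of observations."""
    k = len(groups)                               # number of groups
    n = sum(len(g) for g in groups)               # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    # Mean squares: between over (k-1) df, within over (n-k) df.
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical final averages for quota and non-quota students:
f_stat = one_way_anova_f([[6.1, 5.8, 6.5, 5.9], [7.2, 6.8, 7.5, 7.0]])
```

The resulting F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to decide significance.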
Abstract:
Experimental scratch resistance testing provides two numbers: the penetration depth Rp and the healing depth Rh. In molecular dynamics computer simulations, we create a material consisting of N statistical chain segments by polymerization; a reinforcing phase can be included. Then we simulate the movement of an indenter and response of the segments during X time steps. Each segment at each time step has three Cartesian coordinates of position and three of momentum. We describe methods of visualization of results based on a record of 6NX coordinates. We obtain a continuous dependence on time t of positions of each of the segments on the path of the indenter. Scratch resistance at a given location can be connected to spatial structures of individual polymeric chains.
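The 6NX record described above maps naturally onto a three-dimensional array. A small sketch (names are illustrative, not the authors' code) of recording the run and extracting one segment's path for visualization:

```python
import numpy as np

def record_run(X, N, snapshot):
    """Store 3 position + 3 momentum coordinates per segment per time
    step: a (X, N, 6) array holds all 6*N*X recorded numbers."""
    record = np.empty((X, N, 6))
    for t in range(X):
        record[t] = snapshot(t)      # snapshot(t) -> (N, 6) array
    return record

def segment_path(record, i):
    """Positions of segment i over all time steps, e.g. to follow its
    displacement while the indenter passes."""
    return record[:, i, :3]
```

This layout makes per-segment time series (for the visualizations the paper describes) a constant-time slice rather than a scan over the whole record.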
Abstract:
We have employed molecular dynamics simulations to study the behavior of virtual polymeric materials under an applied uniaxial tensile load. Through computer simulations, one can obtain experimentally inaccessible information about phenomena taking place at the molecular and microscopic levels. Not only can the global material response be monitored and characterized along time, but the response of macromolecular chains can be followed independently if desired. The computer-generated materials were created by emulating step-wise polymerization, resulting in self-avoiding chains in 3D with a controlled degree of orientation along a certain axis. These materials represent a simplified model of the lamellar structure of semi-crystalline polymers, consisting of an amorphous region surrounded by two crystalline lamellar regions. For the simulations, a series of materials were created, varying i) the lamella thickness, ii) the amorphous region thickness, iii) the preferential chain orientation, and iv) the degree of packing of the amorphous region. Simulation results indicate that the lamella thickness has the strongest influence on the mechanical properties of the lamella-amorphous structure, which is in agreement with experimental data. The other morphological parameters also affect the mechanical response, but to a smaller degree. This research follows previous simulation work on crack formation and propagation phenomena, deformation mechanisms at the nanoscale, and the influence of loading conditions on the material response. Computer simulations can improve the fundamental understanding of the phenomena responsible for the behavior of polymeric materials, and will eventually lead to the design of knowledge-based materials with improved properties.
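The step-wise growth of self-avoiding chains with a preferred orientation can be illustrated with a lattice sketch (a deliberate simplification: the paper's materials are off-lattice, and the bias parameter below is illustrative):

```python
import random

MOVES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def grow_chain(n, bias=0.5, seed=0):
    """Self-avoiding walk on a cubic lattice. With probability `bias` the
    +z step is tried first, giving the chain a preferred orientation."""
    rng = random.Random(seed)
    chain = [(0, 0, 0)]
    occupied = {chain[0]}
    while len(chain) < n:
        x, y, z = chain[-1]
        moves = MOVES[:]
        rng.shuffle(moves)
        if rng.random() < bias:
            moves.sort(key=lambda m: -m[2])    # prefer the +z direction
        for dx, dy, dz in moves:
            nxt = (x + dx, y + dy, z + dz)
            if nxt not in occupied:            # self-avoidance check
                chain.append(nxt)
                occupied.add(nxt)
                break
        else:
            break                              # walk got trapped; stop early
    return chain
```

Varying `bias` plays the role of the "preferential chain orientation" parameter; chain length and the number of chains per box would control the other morphological parameters.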
Abstract:
Bone weakening can occur due to the absence of load on the skeleton or even short periods of decreased physical activity. Therefore, musculoskeletal conditions that involve temporary immobilization by casts, inactivity or tension increase the risk of fractures. Physical activity is the most studied procedure both to prevent damage and to restore bone structure. The present study aimed at evaluating, by bone densitometry of rat femurs, the influence of hindlimb unloading followed by either treadmill running or free movement. Sixty-four Wistar rats were used, aged 65 days, with a mean body mass of 316.11 g, randomly divided into eight experimental groups: group 1, the suspended control, with seven animals under hindlimb unloading for 28 days, then euthanized; groups 2 and 3, the trained suspended groups, of seven and five animals, respectively, subjected to hindlimb unloading for 28 days followed by treadmill exercise for 28 days (group 2) or 56 days (group 3), then euthanized; groups 4 and 5, designated free suspended, of seven animals each, under hindlimb unloading for 28 days followed by free activity in cages for 28 days (group 4) or 56 days (group 5), then euthanized; groups 6, 7 and 8, negative controls, each with eight animals allowed free activity in cages and euthanized at 93, 121 and 149 days of age, respectively. Bone mineral density (BMD) of the left femur was analyzed by bone densitometry. Unloading by tail suspension decreased BMD, while treadmill training and free activity in cages promoted its recovery in a similar way over time.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low speed damageability) is one of the most important attributes. In order to fulfill the increased requirements within shorter cycle times and under rising cost pressure, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally. However, a constant thickness value is almost always defined for the entire part, for reasons of complexity. On the other hand, for precise fracture analysis within FEM, correct thickness consideration is one key enabler. Thus, availability of per-element thickness information, which does not exist explicitly in the FEM model, can significantly contribute to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and on nearest neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
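The nearest-neighbour estimator can be sketched compactly: treat the outer surface as a point cloud and take twice the distance from each midplane point to its nearest surface point. This is a simplification of the mesh-based algorithm in the paper (which works on elements and would use a spatial index rather than a linear scan); all names are illustrative:

```python
def thickness_nearest_neighbour(midplane_pts, surface_pts):
    """Per midplane point, estimate the local part thickness as twice the
    distance to the nearest point sampled on the outer surface."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return [2 * min(dist(m, s) for s in surface_pts) for m in midplane_pts]
```

The ray-tracing variant would instead intersect a ray along the element normal with the top and bottom surfaces and sum the two hit distances; the two estimators fail in different geometric arrangements (e.g. near sharp fillets versus sparsely sampled surfaces), which is what motivates combining them.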
Abstract:
Over the last decade, software architecture has emerged as a critical issue in Software Engineering. This encompassed a shift from traditional programming towards software development based on the deployment and assembly of independent components. The specification of both the overall system structure and the interaction patterns between components became a major concern for the working developer. Although a number of formalisms to express behaviour, and to supply the indispensable calculational power to reason about designs, are available, the task of deriving architectural designs on top of popular component platforms has remained largely informal. This paper introduces a systematic approach to derive, from CCS behavioural specifications, the corresponding architectural skeletons in the Microsoft .Net framework, in the form of executable C# and Cω code. The prototyping process is fully supported by a specific tool developed in Haskell.
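The flavour of such a derivation can be suggested by a toy translator (purely illustrative; the paper targets the .Net framework, not the pseudo-syntax emitted below): CCS prefixes become synchronised calls and parallel composition becomes thread spawning:

```python
def skeleton(term, indent=0):
    """Render a CCS term as a thread-based code skeleton (plain text).
    Terms: ("nil",) | ("pref", action, cont) | ("par", left, right)."""
    pad = "    " * indent
    kind = term[0]
    if kind == "nil":
        return pad + "// inactive process"
    if kind == "pref":
        _, action, cont = term
        return (pad + action + "();  // synchronise on action '" + action + "'\n"
                + skeleton(cont, indent))
    if kind == "par":
        _, left, right = term
        return (pad + "spawn {\n" + skeleton(left, indent + 1) + "\n"
                + pad + "}\n"
                + pad + "spawn {\n" + skeleton(right, indent + 1) + "\n"
                + pad + "}")
    raise ValueError("unknown CCS term: " + kind)

# a.(b.0 | 0): perform a, then fork b.0 and 0 in parallel.
print(skeleton(("pref", "a", ("par", ("pref", "b", ("nil",)), ("nil",)))))
```

A real derivation maps actions to typed channel operations and preserves the behavioural semantics; the sketch only shows the shape of the generated skeleton.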
Abstract:
This paper reports on the development of specific slicing techniques for functional programs and on their use for identifying possible coherent components in monolithic code. An associated tool is also introduced. This piece of research is part of a broader project on program understanding and re-engineering of legacy code supported by formal methods.
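Slicing a functional program with respect to a criterion amounts to keeping only the definitions the criterion transitively depends on. A minimal dependency-based sketch (illustrative, far simpler than the paper's techniques, which slice over actual program structure):

```python
def slice_definitions(defs, criterion):
    """defs maps each definition name to the set of names it uses.
    Return the set of definitions the slicing criterion depends on,
    directly or transitively; everything else can be discarded."""
    keep, todo = set(), [criterion]
    while todo:
        name = todo.pop()
        if name in keep or name not in defs:
            continue
        keep.add(name)
        todo.extend(defs[name])   # follow the use-def edges
    return keep
```

Each slice is a candidate coherent component: a self-contained cluster of definitions that can be extracted from the monolithic code.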
Abstract:
More and more software systems rely on non-trivial coordination logic to combine autonomous services, typically running on different platforms and often owned by different organizations. Often, however, coordination data is deeply entangled in the code and therefore difficult to isolate and analyse separately. COORDINSPECTOR is a software tool which combines slicing and program analysis techniques to isolate all coordination elements from the source code of an existing application. Such a reverse engineering process provides a clear view of the services actually invoked as well as of the orchestration patterns which bind them together. The tool analyses Common Intermediate Language (CIL) code, the native language of the Microsoft .Net Framework; its scope of application is therefore quite large, covering potentially any piece of code written in any programming language that compiles to the .Net Framework. The tool generates graphical representations of the coordination layer, identifies the underlying business process orchestrations, and renders them as Orc specifications.
Abstract:
Clone detection is well established for imperative programs. It works mostly at the statement level and is therefore ill-suited for functional programs, whose main constituents are expressions and types. In this paper we introduce clone detection for functional programs using a new intermediate program representation, dubbed Functional Control Tree. We extend clone detection to the identification of non-trivial functional program clones based on the recursion patterns from the so-called Bird-Meertens formalism.
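The idea of comparing programs through an intermediate tree representation can be sketched with Python ASTs standing in for Functional Control Trees (an illustrative substitute, not the paper's representation): two definitions count as clones if their trees match once identifiers and literal values are ignored:

```python
import ast

def shape(node):
    """Structural fingerprint of a Python AST: keep node types and child
    structure, drop identifiers and literal values, so renamed copies of
    the same definition compare equal."""
    if isinstance(node, ast.AST):
        return (type(node).__name__,
                tuple(shape(child) for _, child in ast.iter_fields(node)
                      if isinstance(child, (ast.AST, list))))
    # A list field: fingerprint its AST elements, skip raw strings/values.
    return tuple(shape(item) for item in node if isinstance(item, ast.AST))

def are_clones(src_a, src_b):
    """True if the two sources have the same expression structure."""
    return shape(ast.parse(src_a)) == shape(ast.parse(src_b))
```

Detecting the non-trivial clones the paper targets would require matching recursion patterns (folds, unfolds) rather than raw tree shapes, but the fingerprint-and-compare pipeline is the same.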