932 results for computer system emulation, multiprocessors, educational computer systems
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
Introduction: In the Web environment, greater care is needed in the processing of descriptive and thematic information. Concern with information retrieval in computer systems predates the development of the first personal computers. Information retrieval models have been, and still are, widely used in databases specific to fields whose scope is known. Objectives: To verify how relevance is treated in the main computational models of information retrieval and, especially, how the issue is addressed in the future of the Web, the so-called Semantic Web. Methodology: Bibliographical research. Results: In the classical models studied here, the main concern is to retrieve the documents whose descriptions are closest to the search expression used by the user, which does not necessarily mean they are what the user really needs. Semantic retrieval relies on ontologies, a feature that extends the user's search to a wider range of potentially relevant options. Conclusions: Relevance is a subjective judgment inherent to the user; it depends on the interaction with the system and, above all, on what the user expects to retrieve with the search. Systems based on a relevance model are not popular because they require greater interaction and depend on the user's willingness. The Semantic Web is so far the most effective initiative for information retrieval in the digital environment.
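As a hedged illustration of the classical models the abstract contrasts with semantic retrieval, the sketch below ranks documents purely by closeness (TF-IDF cosine similarity) to the search expression; the corpus and query are made up.

```python
# Minimal sketch of a classical vector-space retrieval model: documents are
# ranked purely by cosine similarity of TF-IDF vectors to the query, with no
# notion of the user's actual information need. Corpus and query are made up.
import math
from collections import Counter

def tf_idf_vectors(tokenized_docs):
    """One {term: weight} TF-IDF vector per tokenized document."""
    n = len(tokenized_docs)
    df = Counter(term for doc in tokenized_docs for term in set(doc))
    return [{t: c * math.log(n / df[t]) for t, c in Counter(doc).items()}
            for doc in tokenized_docs]

def cosine(u, v):
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    norm_u = math.sqrt(sum(w * w for w in u.values()))
    norm_v = math.sqrt(sum(w * w for w in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

docs = ["semantic web ontology retrieval", "database retrieval model",
        "history of personal computers"]
query = "ontology retrieval"
*doc_vecs, query_vec = tf_idf_vectors([d.split() for d in docs] + [query.split()])
for i in sorted(range(len(docs)), key=lambda i: cosine(query_vec, doc_vecs[i]), reverse=True):
    print(f"{cosine(query_vec, doc_vecs[i]):.3f}  {docs[i]}")
```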
Abstract:
This paper presents the radiometric parameters determined by the medical physicist during routine radiotherapy treatment planning for breast cancer cases. The breast volume contours of patients undergoing irradiation of breast tumors at the Hospital das Clínicas, Faculty of Medicine, UNESP, Botucatu (HCFMB) during 2012 were analyzed. To assess the influence of physical and radiometric parameters on the dose distribution in the irradiated breast volume, four measurements of isodose curves were prepared at four different breast heights and compared with computationally plotted isodose curves. In the planning routine, the medical physicist must determine the isodose curve that gives the most homogeneous dose distribution in the irradiated volume. The treatment plan can be chosen with dedicated computer systems, but these require significantly costly investments, available only to services with better financial support. The Medical Physics Service of the Radiotherapy Department at HCFMB uses two-dimensional software to determine isodose curves; however, this software is outdated and frequently becomes inoperable due to lack of maintenance, and it is a closed system that computer professionals cannot modify. This forces the manual preparation of isodose curves, which is subject to uncertainties arising from the subjective clinical interpretation of the radiation oncologist and the medical physicist responsible for the planning, besides expending significant calculation time. The choice of the optimal isodose curve depends on the energy of the radiation beam and on the geometry and dimensions of the irradiated area. The evaluations of the breast contours studied in this work showed that, for a given input energy, such as the 1.25 MeV gamma radiation of the telecobalt therapy unit, the determination of the percentage depth dose (PDD) ...
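For reference, the percentage depth dose mentioned above has a standard textbook definition; the sketch below illustrates it with made-up depth-dose samples, not measurements from this work.

```python
# Hedged sketch of the standard percentage-depth-dose (PDD) definition the
# abstract refers to: PDD(d) = 100 * D(d) / D(d_max). The depth/dose samples
# below are illustrative values only, not data from the paper.
def percentage_depth_dose(dose_at_depth, dose_at_dmax):
    """PDD in percent for a dose reading at depth d along the beam axis."""
    return 100.0 * dose_at_depth / dose_at_dmax

depth_dose = {0.5: 1.84, 1.0: 1.95, 5.0: 1.56, 10.0: 1.12}  # Gy, illustrative
d_max_dose = max(depth_dose.values())
for depth_cm, dose in sorted(depth_dose.items()):
    print(f"depth {depth_cm:4.1f} cm -> PDD {percentage_depth_dose(dose, d_max_dose):5.1f} %")
```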
Abstract:
Graduate Program in Computer Science - IBILCE
Abstract:
This paper deals with transient stability analysis based on time-domain simulation on vector processors. This approach requires the solution of a set of differential equations in conjunction with a set of algebraic equations. The solution of the algebraic equations has usually been treated as a scalar, sequential set of tasks, and solving these equations on vector computers requires considerably more investigation to speed up the simulations. Therefore, the main objective of this paper is to present methods for solving the algebraic equations using vector processing. The results, obtained on a CRAY computer, show that on-line transient stability assessment is feasible.
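As a hedged illustration of why such solvers benefit from vector hardware (this is a generic Jacobi sketch, not the method the paper proposes), each iteration below updates all unknowns of A x = b in whole-array operations rather than a sequential scalar loop.

```python
# Minimal sketch (not the paper's actual method) of why iterative solvers for
# algebraic equations map well onto vector processing: each Jacobi sweep
# updates every unknown of A x = b with one array-wide operation.
import numpy as np

def jacobi(A, b, iters=100):
    D = np.diag(A)                 # diagonal entries
    R = A - np.diagflat(D)         # off-diagonal remainder
    x = np.zeros_like(b)
    for _ in range(iters):
        x = (b - R @ x) / D        # whole-vector update: ideal for vector hardware
    return x

A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])  # diagonally dominant, so Jacobi converges
b = np.array([15.0, 10.0, 10.0])
print(jacobi(A, b))                 # ~ np.linalg.solve(A, b)
```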
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
One problem with the component-based software development approach is that once software modules are reused over generations of products, they form legacy structures that can be challenging to understand, making these systems difficult to validate. Therefore, tools and methodologies that enable engineers to see the interactions of these software modules will enhance their ability to make these software systems more dependable. To address this need, we propose SimSight, a framework to capture dynamic call graphs in Simics, a widely adopted commercial full-system simulator. Simics is a software system that simulates complete computer systems; thus, it performs nearly identical tasks to a real system, but at a much lower speed and with far greater execution observability. We have implemented SimSight to generate dynamic call graphs of statically and dynamically linked functions in an x86/Linux environment. A case study illustrates how SimSight can be used to identify sources of software errors. We then evaluate its performance using 12 integer programs from the SPEC CPU2006 benchmark suite.
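As a schematic illustration of the artifact such a framework produces (SimSight observes calls inside Simics; the trace and function names below are hypothetical), a dynamic call graph can be folded out of a stream of call/return events.

```python
# Schematic sketch of what a dynamic call graph is: caller->callee edges with
# observed call counts, folded from call/return events. The trace below is a
# made-up stand-in, not output from SimSight or Simics.
from collections import defaultdict

def build_dynamic_call_graph(trace):
    """Fold ('call', fn) / ('ret',) events into (caller, callee) -> count."""
    graph = defaultdict(int)
    stack = []
    for event in trace:
        if event[0] == "call":
            if stack:
                graph[(stack[-1], event[1])] += 1
            stack.append(event[1])
        else:  # "ret"
            stack.pop()
    return dict(graph)

trace = [("call", "main"), ("call", "parse"), ("call", "read_byte"), ("ret",),
         ("call", "read_byte"), ("ret",), ("ret",), ("call", "emit"), ("ret",), ("ret",)]
print(build_dynamic_call_graph(trace))
# {('main', 'parse'): 1, ('parse', 'read_byte'): 2, ('main', 'emit'): 1}
```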
Abstract:
An important feature of computer systems developed for the agricultural sector is satisfying the heterogeneity of the data generated in different processes. Most problems related to this heterogeneity arise from the lack of a standard across the different computing solutions proposed. An efficient solution is to create a single standard for data exchange. The study of the actual process involved in cotton production was based on research developed by the Brazilian Agricultural Research Corporation (EMBRAPA) that describes all phases, compiled from several theoretical and practical studies related to the cotton crop. The proposal of a standard starts with the identification of the most important classes of data involved in the process and includes an ontology, a systematization of the concepts related to the production of cotton fiber, resulting in a set of classes, relations, functions, and instances. The results are used as a reference for the development of computational tools, transforming implicit knowledge into applications that support the knowledge described. This research is based on data from the Midwest of Brazil. The cotton process was chosen as the case study because Brazil is one of the major players in this market and several improvements are required for system integration in this segment.
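As a hedged sketch of how an ontology's classes and relations can anchor a single data-exchange format (the class and field names below are hypothetical illustrations, not the ontology proposed in the study):

```python
# Hedged sketch: ontology-style classes, relations, and instances serialized
# in one shared format that heterogeneous tools can parse. All names here are
# hypothetical, not the cotton ontology actually proposed in the study.
from dataclasses import dataclass, asdict
import json

@dataclass
class Field:
    field_id: str
    area_ha: float

@dataclass
class CottonLot:
    lot_id: str
    field: Field          # relation: lot "harvested_from" field
    fiber_grade: str

lot = CottonLot("LOT-001", Field("F-12", 35.5), "middling")
print(json.dumps(asdict(lot), indent=2))  # one serialization all tools can exchange
```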
Abstract:
Chronic obstructive pulmonary disease (COPD) is an umbrella term for diseases that lead to coughing, expectoration, and dyspnea (shortness of breath) at rest or under exertion; chronic bronchitis and pulmonary emphysema are counted among them. The progression of COPD is closely linked to the increase in the wall volume of the small airways (bronchi). High-resolution computed tomography (CT) is considered the gold standard (the best and most reliable diagnostic method) for examining the morphology of the lung. When measuring bronchi, structures that are approximately tubular, in CT images, the small size of the bronchi relative to the resolving power of a clinical CT scanner poses a major problem. This work shows how CT images are computed from conventional X-ray projections, where the mathematical and physical sources of error in the image formation process lie, and how a CT system can be made mathematically tractable by interpreting it as a linear shift-invariant (LSI) system. Based on linear systems theory, ways of describing the resolving power of imaging methods are derived. It is shown how the tracheobronchial tree can be robustly segmented from a CT data set and converted, by means of a topology-preserving 3-dimensional skeletonization algorithm, into a skeleton representation and subsequently into an acyclic graph. Based on linear systems theory, a new, promising integral-based method (IBM) for measuring small structures in CT images is presented. To validate the IBM results, various measurements were performed on a phantom consisting of 10 different silicone tubes. With the help of the skeleton and graph representations, the complete segmented tracheobronchial tree can be measured in 3-dimensional space. For 8 pigs, each scanned twice, good reproducibility of the IBM results was demonstrated. In a further study carried out with IBM, it was shown that the average percentage bronchial wall thickness in CT data sets of 16 smokers is significantly higher than in data sets of 15 non-smokers. IBM may also be usable for wall thickness determination in problems from other fields, or can at least serve as a source of ideas. An article describing the developed method and the study results obtained with it has been accepted for publication in the journal IEEE Transactions on Medical Imaging.
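As a minimal illustration of the LSI-system view described above (all numbers made up), the scanner's blurring can be modeled as convolution of the true profile with a point-spread function, which is why thin bronchial walls appear wider and less dense than they really are:

```python
# Minimal sketch of the LSI-system view: a CT scanner's blurring modeled as
# convolution of the true intensity profile with a point-spread function (PSF).
# Thin walls then look wider and less dense, which is the measurement problem
# an integral-based method must cope with. All numbers are illustrative.
import numpy as np

true_profile = np.zeros(64)
true_profile[30:33] = 1.0                     # a thin "wall", 3 samples wide

x = np.arange(-8, 9)
sigma = 2.0
psf = np.exp(-x**2 / (2 * sigma**2))
psf /= psf.sum()                              # normalized Gaussian PSF

imaged = np.convolve(true_profile, psf, mode="same")
print(true_profile.max(), round(imaged.max(), 3))            # peak drops: wall looks less dense
print((imaged > 0.1).sum(), "samples above 10% vs 3 true")   # wall looks wider
```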
Abstract:
In computer systems, specifically in multithreaded, parallel, and distributed systems, a deadlock is both a very subtle problem, because it is difficult to prevent while coding the system, and a very dangerous one: a deadlocked system easily becomes completely stuck, with consequences ranging from simple annoyances to life-threatening circumstances, with the non-negligible scenario of economic losses in between. So how can this problem be avoided? Many possible solutions have been studied, proposed, and implemented. In this thesis we focus on the detection of deadlocks with a static program analysis technique, i.e., an analysis performed without actually executing the program. To begin, we briefly present the static Deadlock Analysis Model developed for coreABS−− in chapter 1, then we proceed by detailing the class-based coreABS−− language in chapter 2. In chapter 3 we lay the foundation for further discussion by analyzing the differences between coreABS−− and ASP, an untyped object-based calculus, so as to show how the Deadlock Analysis can be extended to object-based languages in general. In this regard, we make some hypotheses explicit in chapter 4, first by presenting a possible, unproven type system for ASP, modeled after the Deadlock Analysis Model developed for coreABS−−. We then conclude our discussion by presenting a simpler hypothesis, which may allow circumventing the difficulties that arise from the definition of the "ad-hoc" type system discussed in the preceding chapter.
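As a toy illustration of the static flavor of deadlock detection in general (not the thesis's coreABS−− analysis), the sketch below derives a lock-order graph from statically visible acquisition orders and flags a cycle as a potential deadlock, without executing the program:

```python
# Toy sketch of static deadlock detection (unrelated to the coreABS-- model):
# record, per code path, the order in which locks are acquired, build a
# lock-order graph, and flag a potential deadlock when the graph has a cycle.
def lock_order_edges(paths):
    """paths: lists of lock names in acquisition order, as seen statically."""
    edges = set()
    for path in paths:
        for i, outer in enumerate(path):
            for inner in path[i + 1:]:
                edges.add((outer, inner))  # 'inner' acquired while holding 'outer'
    return edges

def has_cycle(edges):
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
    visiting, done = set(), set()
    def dfs(node):
        if node in visiting:
            return True                    # back edge: cyclic lock order
        if node in done:
            return False
        visiting.add(node)
        cyclic = any(dfs(n) for n in graph.get(node, ()))
        visiting.discard(node)
        done.add(node)
        return cyclic
    return any(dfs(n) for n in list(graph))

paths = [["a", "b"], ["b", "a"]]           # two threads, opposite lock order
print(has_cycle(lock_order_edges(paths)))  # True: potential deadlock
```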
Abstract:
A web service is a collection of industry standards that enable reusability of services and interoperability of heterogeneous applications. The UMLS Knowledge Source (UMLSKS) Server provides remote access to the UMLSKS and related resources. We propose a Web Services Architecture that encapsulates the UMLSKS-API and makes it available in distributed and heterogeneous environments. This is the first step towards intelligent and automatic discovery and invocation of UMLS services by computer systems in distributed environments such as the Web.
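As a hedged sketch of the encapsulation idea (this is not the actual UMLSKS-API; the endpoint, parameter, and stand-in data below are hypothetical), a thin HTTP service can expose a local lookup to heterogeneous remote clients:

```python
# Hedged sketch of wrapping a local API as a web service, so heterogeneous
# remote clients can invoke it. The endpoint, query parameter, and lookup
# table are hypothetical stand-ins, not the real UMLSKS backend.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

CONCEPTS = {"C0018787": "Heart"}  # illustrative stand-in data

class ConceptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cui = parse_qs(urlparse(self.path).query).get("cui", [""])[0]
        body = json.dumps({"cui": cui, "name": CONCEPTS.get(cui, "unknown")})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

if __name__ == "__main__":
    # e.g. GET http://localhost:8080/concept?cui=C0018787 -> {"cui": ..., "name": "Heart"}
    HTTPServer(("localhost", 8080), ConceptHandler).serve_forever()
```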