883 results for PACS: information technology applications
Abstract:
In a statistical inference scenario, a target signal or its parameters are estimated by processing data from informative measurements. Estimation performance can be enhanced if the measurements are chosen according to criteria that direct the sensing resources so that the measurements are more informative about the parameter we intend to estimate. When multiple measurements are taken, they can be chosen online, so that more information is extracted from the data in each measurement step. This approach fits naturally into the Bayesian inference framework, which is often used to produce successive posterior distributions of the associated parameter. We explore the sensor array processing scenario for adaptive sensing of a target parameter. The measurement choice is described by a measurement matrix that multiplies the data vector normally associated with array signal processing. Adaptive sensing of both static and dynamic system models is performed by selecting an appropriate measurement matrix online over time. For the dynamic system model, the target is assumed to move according to some distribution, so the prior distribution changes at each time step; part of the information gained through adaptive sensing is lost due to the relative shift of the target. The adaptive sensing paradigm has many similarities with compressive sensing. We attempt to reconcile the two approaches by modifying the observation model of adaptive sensing to match the compressive sensing model for the estimation of a sparse vector.
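As a concrete illustration of the online measurement selection described above, here is a minimal sketch (not from the thesis; the Gaussian linear model, candidate set, and all names are illustrative assumptions). It maintains a Gaussian posterior over the parameter and greedily picks, at each step, the candidate measurement vector with the largest predicted information gain, then performs a rank-one Kalman-style posterior update:

```python
import numpy as np

rng = np.random.default_rng(0)

dim, steps, noise_var = 4, 20, 0.1
theta = rng.normal(size=dim)            # unknown parameter (ground truth)
mu, Sigma = np.zeros(dim), np.eye(dim)  # Gaussian prior belief

# Candidate measurement vectors (rows of possible measurement matrices).
candidates = rng.normal(size=(50, dim))
candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)

for _ in range(steps):
    # Gaussian information gain is 0.5*log(1 + a^T Sigma a / noise_var),
    # so the best unit-norm row is the one with largest prior variance.
    gains = np.einsum('ij,jk,ik->i', candidates, Sigma, candidates)
    a = candidates[np.argmax(gains)]

    y = a @ theta + rng.normal(scale=np.sqrt(noise_var))  # take measurement

    # Rank-one Kalman update of the posterior mean and covariance.
    s = a @ Sigma @ a + noise_var
    k = Sigma @ a / s
    mu = mu + k * (y - a @ mu)
    Sigma = Sigma - np.outer(k, a @ Sigma)

print('estimation error:', np.linalg.norm(mu - theta))
```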
Abstract:
Lava flow modeling can be a powerful tool in hazard assessments; however, the ability to produce accurate models is usually limited by a lack of high-resolution, up-to-date Digital Elevation Models (DEMs). This is especially evident at places such as Kilauea Volcano (Hawaii), where active lava flows frequently alter the terrain. In this study, we use a new technique to create high-resolution DEMs of Kilauea using synthetic aperture radar (SAR) data from the TanDEM-X (TDX) satellite. We convert raw TDX SAR data into a geocoded DEM using the GAMMA software [Werner et al., 2000]. This process can be completed in several hours and permits the creation of updated DEMs as soon as new TDX data are available. To test the DEMs, we use the Harris and Rowland [2001] FLOWGO lava flow model combined with the Favalli et al. [2005] DOWNFLOW model to simulate the 3-15 August 2011 eruption on Kilauea's East Rift Zone. Results were compared with simulations using the older, lower-resolution 2000 SRTM DEM of Hawaii. Effusion rates used in the model are derived from MODIS thermal infrared satellite imagery. FLOWGO simulations using the TDX DEM produced a single flow line that matched the August 2011 flow almost perfectly, but could not recreate the entire flow field due to the relatively high noise level of the DEM. The resulting short model flow lengths can be resolved by filtering noise from the DEM. Model simulations using the outdated SRTM DEM produced a flow field that followed a different trajectory from the one observed: numerous lava flows have been emplaced at Kilauea since the creation of the SRTM DEM, leading the model to project flow lines into areas that have since been covered by fresh lava. These results show that DEMs can quickly become outdated on active volcanoes, but our new technique offers the potential to produce accurate, updated DEMs for modeling lava flow hazards.
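The remark about filtering DEM noise suggests a simple preprocessing step. The sketch below is a toy illustration under our own assumptions, not the FLOWGO or DOWNFLOW code: it median-filters a DEM array and traces a steepest-descent flow line, showing why unfiltered elevation spikes can trap and prematurely terminate model flow paths:

```python
import numpy as np
from scipy.ndimage import median_filter

def smooth_dem(dem: np.ndarray, window: int = 5) -> np.ndarray:
    """Suppress speckle-like noise with a median filter, which removes
    isolated elevation spikes while preserving terrain edges better
    than a mean filter would."""
    return median_filter(dem, size=window)

def steepest_descent_path(dem, start, max_steps=10_000):
    """Trace one flow line by always stepping to the lowest of the 8
    neighbouring cells (a toy stand-in for a DOWNFLOW-style router)."""
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        window = dem[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        dr, dc = np.unravel_index(np.argmin(window), window.shape)
        nr, nc = max(r - 1, 0) + dr, max(c - 1, 0) + dc
        if dem[nr, nc] >= dem[r, c]:   # local pit: the flow line stops
            break
        r, c = nr, nc
        path.append((r, c))
    return path

# Synthetic sloping terrain plus noise: the flow line on the smoothed
# DEM typically runs much farther before hitting a noise-induced pit.
noisy = (np.add.outer(np.arange(50, 0, -1), np.zeros(50))
         + np.random.default_rng(1).normal(0, 0.5, (50, 50)))
print(len(steepest_descent_path(noisy, (0, 25))),
      len(steepest_descent_path(smooth_dem(noisy), (0, 25))))
```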
Abstract:
Tumoral gastrin-releasing peptide (GRP) receptors are potential targets for diagnosis and therapy using radiolabeled or cytotoxic GRP analogs. GRP-receptor overexpression has been detected in endocrine-related cancer cells and, more recently, also in the vascular bed of selected tumors. More information on vascular GRP-receptors in cancer is required to assess their potential for vascular targeting applications. Therefore, frequent human cancers (n = 368) were analyzed using in vitro GRP-receptor autoradiography on tissue sections with the (125)I-[Tyr(4)]-bombesin radioligand and/or the universal radioligand (125)I-[d-Tyr(6), beta-Ala(11), Phe(13), Nle(14)]-bombesin(6-14). GRP-receptor-expressing vessels were evaluated in each tumor group for prevalence, quantity (vascular score), and GRP-receptor density. The prevalence of vascular GRP-receptors was variable, ranging from 12% (prostate cancer) to 92% (urinary tract cancer). Different tumor types within a given site had divergent prevalences of vascular GRP-receptors (e.g. lung: small cell cancer: 0%; adenocarcinoma: 59%; squamous carcinoma: 83%). The vascular score also varied widely, with the highest score in urinary tract cancer (1.69), moderate scores in lung (0.91), colon (0.88), kidney (0.84), and biliary tract (0.69) cancers, and low scores in breast (0.39) and prostate (0.14) cancers. Vascular GRP-receptors were expressed in the muscular vessel wall at moderate to high densities. Normal non-neoplastic control tissues from these organs lacked vascular GRP-receptors. In conclusion, tumoral vessels at all evaluated sites express GRP-receptors, suggesting a major biological function of GRP-receptors in neovasculature. Vascular GRP-receptor expression varies between tumor types, indicating tumor-specific mechanisms in its regulation. Urinary tract cancers express vascular GRP-receptors so abundantly that they are promising candidates for vascular targeting applications.
Abstract:
PURPOSE: To assess the literature on the accuracy and clinical performance of computer technology applications in surgical implant dentistry. MATERIALS AND METHODS: Electronic and manual literature searches were conducted to collect information about (1) the accuracy and (2) the clinical performance of computer-assisted implant systems. Meta-regression analysis was performed to summarize the accuracy studies. Failure/complication rates were analyzed using random-effects Poisson regression models to obtain summary estimates of 12-month proportions. RESULTS: Twenty-nine different image guidance systems were included. From 2,827 articles, 13 clinical and 19 accuracy studies were included in this systematic review. The meta-analysis of accuracy (19 clinical and preclinical studies) revealed a total mean error of 0.74 mm (maximum of 4.5 mm) at the entry point in the bone and 0.85 mm (maximum of 7.1 mm) at the apex. For the 5 included clinical studies (506 implants in total) using computer-assisted implant dentistry, the mean failure rate was 3.36% (0% to 8.45%) after an observation period of at least 12 months. Intraoperative complications were reported in 4.6% of the treated cases; these included interocclusal distances too limited to perform guided implant placement, limited primary implant stability, and the need for additional grafting procedures. CONCLUSION: Differing levels and quantities of evidence were available for computer-assisted implant placement, revealing high implant survival rates after only 12 months of observation across different indications and a reasonable level of accuracy. However, long-term clinical data are necessary to identify clinical indications and to justify the additional radiation doses, effort, and costs associated with computer-assisted implant surgery.
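To make the rate-pooling step concrete, the following sketch uses the classic DerSimonian-Laird random-effects estimator on study-level log failure rates. This is a deliberately simplified stand-in for the random-effects Poisson regression the review actually describes, and all inputs are hypothetical:

```python
import numpy as np

def pooled_rate_dl(events, exposure_years):
    """Pool per-study failure rates on the log scale with the
    DerSimonian-Laird random-effects estimator. Each study contributes
    log(events/exposure) with approximate variance 1/events; studies
    with zero events would need a continuity correction first."""
    events = np.asarray(events, float)
    exposure = np.asarray(exposure_years, float)
    y = np.log(events / exposure)           # study-level log rates
    v = 1.0 / events                        # large-sample variances
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)     # fixed-effect pooled log rate
    q = np.sum(w * (y - y_fixed) ** 2)      # Cochran's Q heterogeneity
    tau2 = max(0.0, (q - (len(y) - 1)) /
               (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_re = 1.0 / (v + tau2)                 # random-effects weights
    y_pooled = np.sum(w_re * y) / np.sum(w_re)
    return np.exp(y_pooled)                 # pooled failures per unit time

# Hypothetical inputs: implant failures and implant-years per study.
print(pooled_rate_dl([3, 1, 5], [120.0, 95.0, 210.0]))
```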
Abstract:
Two technical solutions for dual-energy subtraction, using a single or a dual shot, offer different advantages and disadvantages. Their principles are explained, and the main clinical applications are demonstrated with results. Elimination of overlying bone and the proof or exclusion of calcification are the primary aims of energy-subtraction chest radiography, which offers unique information in a range of clinical situations.
Abstract:
Enterprise applications are complex software systems that manipulate large amounts of persistent data and interact with the user through a vast and complex user interface. In particular, applications written for the Java 2 Platform, Enterprise Edition (J2EE) are composed of various technologies such as Enterprise JavaBeans (EJB) or Java Server Pages (JSP), which in turn rely on languages other than Java, such as XML or SQL. In this heterogeneous context, applying existing reverse engineering and quality assurance techniques developed for object-oriented systems is not enough: because those techniques were created to measure quality or provide information about a single aspect of J2EE applications, they cannot properly measure the quality of the entire system. We intend to devise techniques and metrics that measure quality in J2EE applications considering all of their aspects, and to aid their evolution. Using software visualization, we also intend to inspect the structure of J2EE applications and any other aspects that can be investigated through this technique. To do so, we also need to create a unified meta-model encompassing all the elements that compose a J2EE application.
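To illustrate what such a unified meta-model might look like, here is a minimal, hypothetical sketch (the element types, the metric, and all names are our own assumptions, not the authors' design): every artifact, regardless of technology, becomes a node, and a whole-system metric is computed over cross-technology references:

```python
from dataclasses import dataclass, field

@dataclass
class Element:
    """One artifact of the application, whatever its technology."""
    name: str
    technology: str          # e.g. 'Java', 'EJB', 'JSP', 'SQL', 'XML'
    references: list = field(default_factory=list)  # outgoing Element refs

def cross_technology_coupling(elements):
    """Fraction of references that cross a technology boundary: a simple
    whole-system metric that no single-language analysis could compute."""
    refs = [(e, r) for e in elements for r in e.references]
    if not refs:
        return 0.0
    crossing = sum(1 for e, r in refs if e.technology != r.technology)
    return crossing / len(refs)

# Toy model: a JSP page calls an EJB, which executes a SQL statement.
jsp = Element('checkout.jsp', 'JSP')
ejb = Element('OrderBean', 'EJB')
sql = Element('insertOrder', 'SQL')
jsp.references.append(ejb)
ejb.references.append(sql)
print(cross_technology_coupling([jsp, ejb, sql]))   # -> 1.0
```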
Abstract:
In rapidly evolving domains such as Computer Assisted Orthopaedic Surgery (CAOS), emphasis is often put first on innovation and new functionality rather than on developing the common infrastructure needed to support the integration and reuse of these innovations. In fact, developing such an infrastructure is often considered a high-risk venture, given the volatility of the domain. We present CompAS, a method that exploits the very evolution of innovations in the domain to carry out the necessary quantitative and qualitative commonality and variability analysis, especially when system documentation is scarce. We show how our technique applies to the CAOS domain by using conference proceedings as a key source of information about the evolution of features in CAOS systems over a period of several years. We detect and classify evolution patterns to determine functional commonality and variability. We also identify non-functional requirements to help capture domain variability. We have validated our approach by evaluating the degree to which representative test systems can be covered by the common and variable features produced by our analysis.
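As a rough illustration of commonality/variability analysis over proceedings data, the following toy sketch classifies a feature as common if it keeps reappearing after its introduction. The feature names, years, and persistence threshold are illustrative assumptions, not the CompAS algorithm itself:

```python
# Which features are mentioned in each year's proceedings (toy data).
mentions = {
    2001: {'tracking', 'registration'},
    2002: {'tracking', 'registration', 'navigation'},
    2003: {'tracking', 'navigation'},
    2004: {'tracking', 'registration', 'navigation', 'robotics'},
}

def classify(mentions, threshold=0.8):
    """Label each feature 'common' if, once introduced, it appears in at
    least `threshold` of the subsequent years; otherwise 'variable'."""
    years = sorted(mentions)
    first_seen = {}
    for y in years:
        for f in mentions[y]:
            first_seen.setdefault(f, y)
    result = {}
    for f, y0 in first_seen.items():
        span = [y for y in years if y >= y0]
        ratio = sum(f in mentions[y] for y in span) / len(span)
        result[f] = 'common' if ratio >= threshold else 'variable'
    return result

print(classify(mentions))
# e.g. 'tracking' -> common (4/4 years), 'registration' -> variable (3/4)
```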
Abstract:
Code queries focus mainly on the static structure of a system. To comprehend the dynamic behavior of a system, however, a software engineer needs to be able to reason about its dynamics, for instance by querying a database of dynamic information. Such a querying mechanism should be directly available in the IDE, where developers implement, navigate, and reason about the software system. We propose (i) concepts to gather dynamic information, (ii) the means to query this information, and (iii) tools and techniques to integrate the querying of dynamic information into the IDE, including the presentation of the results generated by queries.
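A minimal sketch of the idea, under our own assumptions rather than the authors' tool design: instrument methods so that each invocation is recorded, then run a query over the gathered dynamic information:

```python
import functools
import collections

TRACE = []   # gathered dynamic information: one record per invocation

def traced(fn):
    """Record every call (receiver class, method name) so the runtime
    behaviour can be queried afterwards; a stand-in for the kind of
    instrumentation an IDE-integrated tool would install."""
    @functools.wraps(fn)
    def wrapper(self, *args, **kwargs):
        TRACE.append({'class': type(self).__name__, 'method': fn.__name__})
        return fn(self, *args, **kwargs)
    return wrapper

class Cart:
    @traced
    def add(self, item): pass
    @traced
    def total(self): return 0

c = Cart()
c.add('book'); c.add('pen'); c.total()

# A "code query" over dynamic data: how often was each method executed?
counts = collections.Counter((e['class'], e['method']) for e in TRACE)
print(counts.most_common())  # [(('Cart', 'add'), 2), (('Cart', 'total'), 1)]
```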
Abstract:
The use of virtual learning environments in Higher Education (HE) has been growing in Portugal, driven by the Bologna Process. One example is the use of Learning Management Systems (LMS), which represent an opportunity to leverage technological advances in the educational process. The progress of information and communication technologies (ICT), coupled with the rapid development of the Internet, has brought significant challenges to educators, demanding a thorough knowledge of the implementation process. These field notes present the results of a survey among teachers of a private HE institution on their use of Moodle as a tool to support face-to-face teaching. A research methodology of an essentially exploratory nature, based on a questionnaire survey and supported by statistical treatment, made it possible to identify teachers' motivations, types of use, and perceptions regarding this kind of tool. The results showed that most teachers, by a narrow margin (58%), had not changed their pedagogical practice as a consequence of using Moodle; among those who had, 67% had attended institutional internal training. Some of the results suggest further investigation and provide guidelines for planning future internal training.
Abstract:
This review article examines mobile learning from an educational-psychological and didactic perspective. Mobile learning (M-learning), which has found its way into the education sector in a wide variety of contexts since the mid-1990s, is a dynamic and interdisciplinary field. Dynamic, because M-learning, driven by the rapid development of information and communication technology, is subject to greater change than almost any other field of research. Interdisciplinary, because the meeting of mobile technology and learning touches several academic disciplines. These different perspectives, together with the complexity of the field, are why no unified definition of the term exists to this day. The aim of this review is to present the current state of research from a didactic and educational-psychological point of view. To this end, important components of the concept of M-learning are first identified; then, didactically significant theoretical approaches and models are presented and critically examined. Based on this theoretical starting point, a framework is drawn up to clarify where empirical research can begin from a didactic and educational-psychological perspective. Corresponding empirical studies are also presented to give an impression of the current state of empirical research. All of this is intended to serve as a starting point for identifying future research needs.
Abstract:
Internet-based job portals provide, in the form of job advertisements, an interesting data source for making transparent the qualification requirements that recruiting companies place on prospective university graduates. By analyzing these qualification requirements, universities can develop their education and continuing-education offerings in a labor-market-oriented way and thus sharpen their profile in the higher education landscape. To do so, however, the job advertisements must be extracted from the job portals and processed further with adequate analytical information systems. This contribution to the CampusSource White Paper Award presents a concept for Job Intelligence services that permit the systematic analysis of qualification requirements based on job advertisements from job portals.
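To make the extraction and analysis step concrete, here is a minimal sketch (the qualification catalogue and sample advertisements are purely illustrative, not from the paper) that counts how often catalogue terms appear across job advertisements:

```python
import re
from collections import Counter

# Hypothetical catalogue of qualification terms to look for.
QUALIFICATIONS = ['java', 'sql', 'project management', 'english', 'sap']

ads = [
    'We require solid Java and SQL skills plus fluent English.',
    'Looking for graduates with SAP experience and project management basics.',
    'Java developer wanted; English a plus.',
]

def demand_profile(ads, catalogue):
    """Count in how many advertisements each qualification term occurs,
    using whole-word matching on lowercased text."""
    counts = Counter()
    for ad in ads:
        text = ad.lower()
        for q in catalogue:
            if re.search(r'\b' + re.escape(q) + r'\b', text):
                counts[q] += 1
    return counts

print(demand_profile(ads, QUALIFICATIONS).most_common())
```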
Abstract:
Teaching is a dynamic activity. It can be very effective if its impact is constantly monitored and adjusted to the demands of changing social contexts and the needs of learners. This implies that teachers need to be aware of teaching and learning processes. Moreover, they should constantly question their didactical methods and the learning resources they provide to their students. They should reflect on whether their actions are suitable, and they should regulate their teaching, e.g., by updating learning materials based on new knowledge about learners, or by motivating learners to engage in further learning activities. In recent years, a rising interest in ‘learning analytics’ has become observable. This interest is motivated by the availability of massive amounts of educational data. The continuously increasing processing power and a strong motivation for discovering new information in these pools of educational data are also pushing further developments within the learning analytics research field. Learning analytics could be a method for reflective teaching practice that enables and guides teachers to investigate and evaluate their work in future learning scenarios. However, this potentially positive impact has not yet been sufficiently verified by learning analytics research. Another method that pursues these goals is ‘action research’. Learning analytics promises to initiate action research processes because it facilitates awareness, reflection, and regulation of teaching activities, analogously to action research. Therefore, this thesis joins both concepts in order to improve the design of learning analytics tools. The central research questions of this thesis are: What are the dimensions of learning analytics in relation to action research that need to be considered when designing a learning analytics tool? How does a learning analytics dashboard impact the teachers of technology-enhanced university lectures regarding ‘awareness’, ‘reflection’, and ‘action’? Does it initiate action research? What are the central requirements for a learning analytics tool that pursues such effects? This project followed design-based research principles in order to answer these research questions. The main contributions are: a theoretical reference model that connects action research and learning analytics, the conceptualization and implementation of a learning analytics tool, a requirements catalogue for useful and usable learning analytics design based on evaluations, a tested procedure for impact analysis, and guidelines for the introduction of learning analytics into higher education.