36 results for Visualization Using Computer Algebra Tools
Abstract:
Traditionally, the representation of competencies has been very difficult using computer-based techniques. This paper introduces competencies, how they are represented, and the related concept of competency frameworks, as well as the difficulties in using traditional ontology techniques to formalise them. A “vaguely” formalised framework, developed within the EU project TRACE, is presented. The framework can be used to represent different competencies and competency frameworks. Through a case study using an example from the IT sector, it is shown how these can be used by individuals and organisations to specify their individual competency needs. Furthermore, it is described how these representations are used for comparisons between different specifications, applying ontologies and ontology toolsets. The end result is a comparison that is not binary but ternary, providing “definite matches”, “possible / partial matches”, and “no matches” using a “traffic light” analogy.
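A three-valued comparison of this kind can be sketched as a toy function; the competency labels and the set-overlap matching rule below are illustrative assumptions, not the TRACE framework's actual ontology-based comparison:

```python
def compare_competencies(required, offered):
    """Toy three-valued ('traffic light') competency comparison.

    required, offered: sets of competency labels.
    Returns 'green' (definite match), 'amber' (possible/partial
    match), or 'red' (no match).
    """
    overlap = required & offered
    if overlap == required:
        return "green"   # definite match: every requirement covered
    if overlap:
        return "amber"   # partial match: some requirements covered
    return "red"         # no match: nothing in common

# Illustrative IT-sector competencies (hypothetical labels)
need = {"java", "sql", "unit-testing"}
print(compare_competencies(need, {"java", "sql", "unit-testing", "git"}))  # green
print(compare_competencies(need, {"java"}))                               # amber
print(compare_competencies(need, {"cobol"}))                              # red
```

The point of the three-valued result is that a recruiter or organisation sees not just pass/fail but also "worth a closer look" candidates.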
Abstract:
This paper presents a quantitative evaluation of a tracking system on the PETS 2015 Challenge datasets using well-established performance measures. Using existing tools, the tracking system implements an end-to-end pipeline that includes object detection, tracking and post-processing stages. The evaluation results are presented on the provided sequences of both the ARENA and P5 datasets of the PETS 2015 Challenge. The results show an encouraging performance of the tracker in terms of accuracy, but also a tendency to be prone to cardinality errors and ID changes on both datasets. Moreover, the analysis shows a better performance of the tracker on visible imagery than on thermal imagery.
Abstract:
Background— The NADPH oxidase, by generating reactive oxygen species, is involved in the pathophysiology of many cardiovascular diseases and represents a therapeutic target for the development of novel drugs. A single-nucleotide polymorphism (SNP), C242T, of the p22phox subunit of NADPH oxidase has been reported to be negatively associated with coronary heart disease (CHD) and may predict disease prevalence. However, the underlying mechanisms remain unknown. Methods and Results— Using computer molecular modelling, we discovered that the C242T SNP causes significant structural changes in the extracellular loop of p22phox and reduces the stability of its interaction with the Nox2 subunit. Gene transfection of human pulmonary microvascular endothelial cells showed that C242T p22phox significantly reduced Nox2 expression but had no significant effect on basal endothelial O2.- production or on the expression of Nox1 and Nox4. When cells were stimulated with TNFα (or high glucose), C242T p22phox significantly inhibited TNFα-induced Nox2 maturation, O2.- production, MAPK and NFκB activation, and inflammation (all p<0.05). These C242T effects were further confirmed using p22phox shRNA-engineered HeLa cells and Nox2-/- coronary microvascular endothelial cells. Clinical significance was investigated using saphenous vein segments from non-CHD subjects after phlebectomies. The TT (C242T) allele was common (prevalence of ~22%), and compared to CC, veins bearing the TT allele had significantly lower levels of Nox2 expression and O2.- generation in response to high-glucose challenge. Conclusions— The C242T SNP causes p22phox structural changes that inhibit endothelial Nox2 activation and the oxidative response to TNFα or high-glucose stimulation. The C242T SNP may represent a natural protective mechanism against inflammatory cardiovascular diseases.
Abstract:
Information technology in construction (ITC) has been gaining wide acceptance and is being implemented in the construction research domains as a tool to assist decision makers. Most of the research into visualization technologies (VT) has been on the wide range of 3D and simulation applications suitable for construction processes. Despite its development with interoperability and standardization of products, VT usage has remained very low when it comes to communicating and addressing the needs of building end-users (BEU). This paper argues that building end-users are a source of experience and expertise that can be brought into the briefing stage for the evaluation of design proposals. It also suggests that the end-user is a source of new ideas promoting innovation. In this research, a positivistic methodology that includes the comparison of 3D models and the traditional 2D methods is proposed. It will help to identify "how much", if anything, a non-spatial specialist can gain in terms of "understanding" of a particular design proposal presented using both methods.
Abstract:
Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours running on 40 processors, and produces roughly 20 GB of output as 50,000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command-line utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) the client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight, and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al. 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al. 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
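The substitution described in step (3) can be illustrated with a small helper; the workflow text and command names below are illustrative (real G-Rex workflow scripts and GRexRun options will differ):

```python
import re

def grexify(script_text):
    """Replace direct model launches (e.g. 'mpirun ...') with calls to
    the G-Rex client 'GRexRun', leaving the rest of the workflow
    script untouched."""
    return re.sub(r"\bmpirun\b", "GRexRun", script_text)

# Hypothetical three-step NEMO workflow script
workflow = "preprocess_inputs.sh\nmpirun -np 40 nemo.exe\npostprocess_outputs.sh"
print(grexify(workflow))
```

Because only the launch command changes, the pre- and post-processing stages of the existing workflow carry over unmodified, which is the usability point the abstract makes.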
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times faster than previous designs.
Abstract:
Medical universities and teaching hospitals in Iraq are facing a lack of professional staff due to the ongoing violence that forces them to flee the country. The professionals are now distributed outside the country, which reduces the chances for staff and students to be physically in one place to continue the teaching, and limits the efficiency of consultations in hospitals. A survey was conducted among students and professional staff in Iraq to identify the problems in the learning and clinical systems and how Information and Communication Technology could improve them. The survey showed that 86% of the participants use the Internet as a learning resource and 25% for clinical purposes, while less than 11% of them use it for collaboration between different institutions. A web-based collaborative tool is proposed to improve the teaching and clinical systems. The tool helps users collaborate remotely to increase the quality of the learning system, and it can also be used for remote medical consultation in hospitals.
Abstract:
Mediterranean ecosystems rival tropical ecosystems in terms of plant biodiversity. The Mediterranean Basin (MB) itself hosts 25 000 plant species, half of which are endemic. This rich biodiversity, together with complex biogeographical and political issues, makes conservation a difficult task in the region. Species, habitat, ecosystem and landscape approaches have been used to identify conservation targets at various scales: i.e. European, national, regional and local. Conservation decisions require adequate information at the species, community and habitat levels. Nevertheless, despite recent improvements and efforts, this information is still incomplete and fragmented, and varies from one country to another. This paper reviews the biogeographic data, the problems arising from current conservation efforts, and methods for conservation assessment and prioritization using GIS. GIS has an important role to play in managing spatial and attribute information on the ecosystems of the MB and in facilitating interactions with existing databases. Where limited information is available, GIS can be used for prediction when directly or indirectly linked to externally built models. As well as being a predictive tool, GIS today incorporates spatial techniques that can improve the level of information, such as fuzzy logic and geostatistics, or provide insight into landscape changes, such as 3D visualization. Where resources are limited, it can assist with identifying sites of conservation priority or with the resolution of environmental conflicts (scenario building). Although not a panacea, GIS is an invaluable tool for improving the understanding of Mediterranean ecosystems and their dynamics and for practical management in a region that is under increasing pressure from human impact.
Abstract:
In order to gain a better understanding of online conceptual collaborative design processes, this paper investigates how student designers make use of a shared virtual synchronous environment when engaged in conceptual design. The software enables users to talk to each other and share sketches when they are remotely located. The paper describes a novel methodology for observing and analysing collaborative design processes by adapting the concepts of grounded theory. Rather than concentrating on narrow aspects of the final artefacts, the analysis generates emerging “themes” that provide a broader picture of the collaborative design process and context descriptions. Findings on the themes of “grounding – mutual understanding” and “support creativity” complement findings from other research, while important themes associated with “near-synchrony” have not been emphasised in other research. From the study, a series of design recommendations is made for the development of tools to support online computer-supported collaborative work in design using a shared virtual environment.
Abstract:
Purpose – The purpose of this paper is to investigate the concepts of intelligent buildings (IBs) and the opportunities offered by the application of computer-aided facilities management (CAFM) systems. Design/methodology/approach – In this paper, definitions of IBs are investigated using a questionnaire survey, particularly definitions embracing open standards for effective operational change. The survey further investigated the extension of CAFM to IB concepts and the opportunities that such integrated systems will provide to facilities management (FM) professionals. Findings – The results showed variation in the understanding of the concept of IBs and the application of CAFM. The survey showed that 46 per cent of respondents use a CAFM system, with a majority agreeing on the potential of CAFM in the delivery of effective facilities. Research limitations/implications – The questionnaire survey results are limited to the views of the respondents within the context of FM in the UK. Practical implications – Following the many definitions of an IB does not necessarily lead to technologies or equipment that conform to an open standard. Such open standards, and the documentation of systems produced by vendors, are the key to integrating CAFM with other building management systems (BMS) and further harnessing the application of CAFM for IBs. Originality/value – The paper gives experience-based suggestions for both the demand and supply sides of service procurement to gain the feasible benefits and avoid the obstacles that currently hinder them, and provides insight into current and future tools for the mobile aspects of FM. The findings are relevant for service providers and operators as well.
Abstract:
This paper illustrates how nonlinear programming and simulation tools, which are available in packages such as MATLAB and SIMULINK, can easily be used to solve optimal control problems with state- and/or input-dependent inequality constraints. The method presented is illustrated with a model of a single-link manipulator. The method is suitable for teaching to advanced undergraduate and Master's level students in control engineering.
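The general approach, transcribing the optimal control problem into a finite-dimensional nonlinear program and handling the input constraint directly, can be sketched in pure Python. The scalar dynamics, horizon, penalty weight and projected-gradient solver below are illustrative assumptions, not the single-link manipulator model or the MATLAB solver used in the paper:

```python
# Direct transcription of a toy optimal control problem:
# minimise sum(u_k^2) for x_{k+1} = x_k + u_k, x_0 = 0,
# subject to the input constraint |u_k| <= 1, with a quadratic
# penalty enforcing the terminal condition x_N = 5 softly.
N, target, rho = 10, 5.0, 100.0
u = [0.0] * N          # decision variables: the control sequence
step = 0.0004          # step size below 1 / (Lipschitz constant of gradient)

for _ in range(20000):
    err = sum(u) - target                 # terminal-state error x_N - target
    for k in range(N):
        g = 2 * u[k] + 2 * rho * err      # gradient of cost + penalty
        u[k] = min(1.0, max(-1.0, u[k] - step * g))  # project onto [-1, 1]

x_final = sum(u)
print(x_final)  # close to 5 (slightly under, since the constraint is soft)
```

A package solver such as MATLAB's constrained optimisers would replace the hand-written projected-gradient loop, but the transcription step, turning controls into decision variables with bound constraints, is the same idea.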
Abstract:
The work reported in this paper is motivated by the need to handle single-node failures for parallel summation algorithms in computer clusters. An agent-based approach is proposed in which a task to be executed is decomposed into sub-tasks and mapped onto agents that traverse computing nodes. The agents intercommunicate across computing nodes to share information in the event of a predicted node failure. Two single-node failure scenarios are considered. The Message Passing Interface is employed for implementing the proposed approach. Quantitative results obtained from experiments reveal that the agent-based approach can handle failures more efficiently than traditional failure-handling approaches.
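The decompose-and-migrate idea can be sketched as a toy single-process simulation; the chunking scheme and migrate-to-next-node rule below are illustrative assumptions, not the paper's MPI implementation or its failure-prediction mechanism:

```python
def parallel_sum_with_failure(data, n_nodes, failing_node):
    """Toy sketch of agent-based fault tolerance for parallel summation:
    the task is decomposed into one sub-task (chunk) per agent/node;
    when a node failure is predicted, that node's agent migrates its
    sub-task to a neighbouring healthy node before the failure occurs."""
    # Decompose the summation into one chunk per agent
    chunks = [data[i::n_nodes] for i in range(n_nodes)]
    # Predicted failure: migrate the doomed agent's chunk to the next
    # node (wrapping around) and retire the failing node
    survivor = (failing_node + 1) % n_nodes
    chunks[survivor] = chunks[survivor] + chunks[failing_node]
    chunks[failing_node] = []
    # Surviving agents compute partial sums, which are then combined
    return sum(sum(chunk) for chunk in chunks)

print(parallel_sum_with_failure(list(range(100)), n_nodes=4, failing_node=2))  # 4950
```

Because the sub-task is moved intact before the node dies, the combined result is identical to the failure-free run, which is the correctness property the agent-based scheme preserves.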