783 results for web-based maps
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures the sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make each user unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis.
A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. The tool is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous-monitoring technique can be applied to other user-based systems, such as mobile devices and the analysis of network traffic.
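The n-gram approach the abstract describes can be illustrated with a minimal sketch: train counts of consecutive action pairs from a user's normal web log, then score new sessions by their smoothed n-gram probability so that large deviations stand out. The function names, smoothing choice (Laplace), and vocabulary size are illustrative assumptions, not details of the Intruder Detector system.

```python
import math
from collections import defaultdict

def train_ngrams(actions, n=2):
    """Count n-grams (and their contexts) from a training log of user actions."""
    counts, context = defaultdict(int), defaultdict(int)
    for i in range(len(actions) - n + 1):
        gram = tuple(actions[i:i + n])
        counts[gram] += 1
        context[gram[:-1]] += 1
    return counts, context

def session_score(actions, counts, context, n=2, alpha=1.0, vocab=50):
    """Geometric-mean smoothed n-gram probability of a session.
    A low score means the session deviates from the user's usual behavior."""
    logp, grams = 0.0, 0
    for i in range(len(actions) - n + 1):
        gram = tuple(actions[i:i + n])
        # Laplace smoothing: unseen action pairs get a small, nonzero probability
        p = (counts[gram] + alpha) / (context[gram[:-1]] + alpha * vocab)
        logp += math.log(p)
        grams += 1
    return math.exp(logp / max(grams, 1))
```

A session that repeats the user's habitual action pairs scores higher than one made of unseen pairs, which is the signal a continuous-authentication monitor would threshold on.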
Abstract:
Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems, and the knowledge models that support them, enable the incorporation of valuable experience into software systems, helping agricultural technicians make rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management, since pests reduce crop yields and their control requires expert knowledge. This paper presents a formalisation of the pest control problem and of the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for maintaining and extending the knowledge embedded in a pest management decision support system, which is also presented. This software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool for pest control, particularly for training new technicians and inexperienced farmers.
Abstract:
The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from a literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of responses from these baselines was used in the analysis. Finally, conclusions were drawn by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further, and more R&D work is needed to advance to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify it and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors.
Yin's (2009) Case Study Research: Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential obstacles to the implementation of web-based project management systems for the dredging industry. Keen's (1981) incremental-change and facilitative-approach tactics were used as a basis for classifying solutions and for overcoming resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.
Abstract:
Aim: Rather than being rigid, habitual behaviours may be determined by dynamic mental representations that can adapt to context changes. This adaptive potential may result from particular conditions dependent on the interaction between two sources of mental construct activation: perceived context applicability and cognitive accessibility. Method: Two web-shopping simulations offering the choice between habitually chosen and non-habitually chosen food products were presented to participants. The design considered two choice contexts differing in the perceived applicability of the habitual behaviour (low vs. high) and a measure of habitual behaviour chronicity. Results: Study 1 demonstrated a perceived applicability effect, with more habitual (non-organic) than non-habitual (organic) food products chosen in a high perceived applicability (familiar) context than in a low perceived applicability (new) context. The adaptive potential of habitual behaviour was evident in the consistency of habitual product choices across three successive choices, despite the decrease in perceived applicability. Study 2 evidenced the adaptive potential in participants with strong habitual behaviour (high chronic accessibility), who chose a habitual product (milk) more than a non-habitual product (orange juice) even when perceived applicability was reduced (new context). Conclusion: Results portray consumers as adaptive decision makers that can flexibly cope with changes in their (inner and outer) choice contexts.
Abstract:
Part 13: Virtual Reality and Simulation
Abstract:
The purpose of this study was to explore the relationship between faculty perceptions, selected demographics, implementation of elements of transactional distance theory, and online web-based course completion rates. This theory posits that the high transactional distance of online courses makes it difficult for students to complete these courses successfully; too often this is associated with low completion rates. Faculty members play an indispensable role in course design, whether online or face-to-face. They also influence the course delivery format from design through implementation and, ultimately, how students will experience the course. This study used transactional distance theory as the conceptual framework to examine the relationship between the teaching and learning strategies used by faculty members and student completion of online courses. Faculty members' sex, number of years teaching online at the college, and their online course completion rates were considered. A researcher-developed survey was used to collect data from 348 faculty members who teach online at two prominent colleges in the southeastern part of the United States. An exploratory factor analysis resulted in six factors related to transactional distance theory. The factors accounted for slightly over 65% of the variance of transactional distance scores as measured by the survey instrument. Results provided support for Moore's (1993) theory of transactional distance. Female faculty members scored higher than men on all the factors of transactional distance theory. The number of years faculty had taught online at the college level correlated significantly with all the elements of transactional distance theory. Regression analysis showed that two of the factors, instructor interface and instructor-learner interaction, accounted for 12% of the variance in student online course completion rates.
In conclusion, of the six factors found, the two with the highest percentage scores were instructor interface and instructor-learner interaction. This finding, while in alignment with the literature concerning the dialogue element of transactional distance theory, draws special attention to the importance of instructor interface as a factor. Surprisingly, given the reviewed literature on transactional distance theory, faculty perceptions concerning learner-learner interaction were not an important factor, and no learner-content interaction factor emerged.
Abstract:
As part of a multidisciplinary and integrated research effort spanning conservation sciences and history, a proposal is presented for the historical reconstitution and virtual restoration of the mannerist altarpiece of the main altar at the Espírito Santo Church, in Évora. The collected data are abundant, but the scientific information, because of its technical nature, is not easily understood by the general public and thus becomes less accessible. Web-based infographics are explored as privileged forms of disseminating results and raising awareness of Cultural Heritage. The project materializes as an Internet platform where the data and a reconstitution proposal are shared in a visual and interactive way. In addition to the digital virtual reconstitution (2D), three-dimensional (3D) models of various elements of the altarpiece are presented, obtained using methods of computer graphics and digital photogrammetry.
Abstract:
This thesis describes the process of building software for transport accessibility analysis. The goal was to create software that is easy to distribute and simple to use for users without a particular background in geographical data analysis. It was shown that existing tools are not suited to this task because of complex interfaces or significant rendering time. The goal was accomplished by applying modern approaches to building web applications, such as maps based on vector tiles, the FLUX architecture design pattern, and module bundling. It was discovered that vector tiles have considerable advantages over image-based tiles, such as faster rendering and real-time styling.
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing an essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the maps and information necessary to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) for a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resulting loss scenarios across different return periods of the hazard event. The application of the developed prototype is demonstrated using a regional data set from one of the case study sites of the Marie Curie ITN CHANGES project, the Fella River in northeastern Italy.
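Combining loss scenarios with different return periods into an annualized risk, as the abstract describes, is commonly done by integrating loss over annual exceedance probability (p = 1/return period). The sketch below uses the trapezoidal rule; it is an illustrative assumption about the calculation, not the platform's actual implementation.

```python
def annualized_risk(scenarios):
    """Approximate annualized risk (expected annual loss) by integrating loss
    over annual exceedance probability p = 1 / return_period, trapezoidal rule.
    scenarios: iterable of (return_period_years, loss) pairs."""
    # Sort by decreasing exceedance probability (increasing return period)
    pts = sorted(((1.0 / t, loss) for t, loss in scenarios), reverse=True)
    risk = 0.0
    for (p1, l1), (p2, l2) in zip(pts, pts[1:]):
        # Area of one trapezoid between two adjacent scenarios
        risk += (p1 - p2) * (l1 + l2) / 2.0
    return risk
```

For example, a 10-year event causing a loss of 100 and a 100-year event causing 1000 yield an expected annual loss of (0.1 − 0.01) × (100 + 1000)/2 = 49.5 in the same loss units.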
Abstract:
Affiliation: Département de biochimie, Faculté de médecine, Université de Montréal
Abstract:
Producing a rich, personalized Web-based consultation tool for plastic surgeons and patients is challenging.
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are becoming increasingly important for delivering the Quality of Service users expect. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source online indexing and querying system for big geospatial data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map that can efficiently support many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology to customize map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand.
v-TerraFly is a set of techniques for predicting the demand of map workloads online and optimizing resource allocations, considering both response time and data freshness as the QoS targets. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands 18.91% more accurately and allocate resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage compared to traditional peak-load-based resource allocation.
Abstract:
CysView is a web-based application tool that identifies and classifies proteins according to their disulfide connectivity patterns. It accepts a dataset of annotated protein sequences in various formats and returns a graphical representation of cysteine pairing patterns. CysView displays cysteine patterns for those records in the data with disulfide annotations, and allows records to be viewed grouped by connectivity pattern. CysView's utility as an analysis tool was demonstrated by the rapid and correct classification of scorpion toxin entries from GenPept on the basis of their disulfide pairing patterns. It has proved useful for the rapid detection of irrelevant and partial records, or those with incomplete annotations. CysView can also be used to support inferences of distant homology between proteins. CysView is publicly available at http://research.i2r.a-star.edu.sg/CysView/.
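The grouping by connectivity pattern that this abstract describes can be sketched by abstracting each record's disulfide bonds from absolute sequence positions to cysteine ranks, so that proteins with the same pairing topology fall into the same group. The record format and function names below are illustrative assumptions, not CysView's actual interface.

```python
def connectivity_pattern(bonds):
    """Reduce a list of disulfide bonds (pairs of cysteine sequence positions)
    to an abstract pattern over cysteine ranks, e.g. ((1, 3), (2, 4))."""
    cys = sorted({pos for bond in bonds for pos in bond})
    rank = {pos: i + 1 for i, pos in enumerate(cys)}  # 1-based cysteine rank
    return tuple(sorted(tuple(sorted((rank[a], rank[b]))) for a, b in bonds))

def group_by_pattern(records):
    """Group annotated sequence records by their connectivity pattern."""
    groups = {}
    for name, bonds in records.items():
        groups.setdefault(connectivity_pattern(bonds), []).append(name)
    return groups
```

Two toxins whose cysteines sit at different sequence positions but pair in the same 1-3/2-4 topology end up in one group, while a 1-2/3-4 record forms its own group, mirroring the classification the tool performs.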