971 results for Web Front End
Abstract:
This paper presents our Semantic Web portal infrastructure, which focuses on how to enhance knowledge access in traditional Web portals by gathering and exploiting semantic metadata. Special attention is paid to three important issues that affect the performance of knowledge access: i) high quality metadata acquisition, which concerns how to ensure high quality while gathering semantic metadata from heterogeneous data sources; ii) semantic search, which addresses how to meet the information querying needs of ordinary end users who are not necessarily familiar with the problem domain or the supported query language; and iii) semantic browsing, which concerns how to help users understand and explore the problem domain.
Abstract:
Existing semantic search tools have been designed primarily to enhance the performance of traditional search technologies, but with little support for ordinary end users who are not necessarily familiar with domain-specific semantic data, ontologies, or SQL-like query languages. This paper presents SemSearch, a search engine that pays special attention to this issue by providing several means to hide the complexity of semantic search from end users, thus making it both easy to use and effective.
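The complexity-hiding idea described above can be sketched as a translation from plain keywords to a formal query. The sketch below is illustrative only: the function name, the RDF labelling scheme, and the query shape are assumptions for the example, not SemSearch's actual mechanism.

```python
# Hypothetical sketch: hide a formal query language behind plain keywords.
# Assumes resources carry rdfs:label annotations; all names are illustrative.

def keywords_to_sparql(subject_keyword, *property_keywords):
    """Translate user keywords into a SPARQL-like query string,
    so end users never have to write the formal syntax themselves."""
    patterns = [f'?x rdfs:label "{subject_keyword}" .']
    for i, kw in enumerate(property_keywords):
        patterns.append(f'?x ?p{i} ?v{i} . ?v{i} rdfs:label "{kw}" .')
    return "SELECT ?x WHERE { " + " ".join(patterns) + " }"

print(keywords_to_sparql("news", "phd students"))
```

A real engine would also rank candidate interpretations of each keyword; the point here is only that the user types words, not query syntax.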
Abstract:
This paper reports on an experiment of using a publisher provided web-based resource to make available a series of optional practice quizzes and other supplementary material to all students taking a first year introductory microeconomics module. The empirical analysis evaluates the impact these supplementary resources had on student learning. First, we investigate which students decided to make use of the resources. Then, we analyse the impact this decision has on their subsequent performance in the examination at the end of the module. The results show that, even after taking into account the possibility of self-selection bias, using the web-based resource had a significant positive effect on student learning.
Abstract:
This paper concerns the application of recent information technologies to create a software system for numerical simulations in the domain of plasma physics, in particular metal vapor lasers. The work presented involves modernizing legacy physics software for reuse on the web and inside a Service-Oriented Architecture environment. The creation of Java front-ends to legacy C++ and FORTRAN codes is described, followed by the transformation of some of the scientific components into web services and the creation of a web interface to the legacy application. The use of the BPEL language for managing scientific workflows is also considered.
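The front-end pattern above, a thin high-level wrapper over a compiled legacy routine, can be sketched in a few lines. This is a minimal stand-in, not the paper's Java/JNI approach: it loads the C math library as a placeholder for a legacy native code and hides the calling convention behind a plain function.

```python
import ctypes
import ctypes.util

# Minimal sketch of a scripting front-end to a legacy native routine.
# libm here is a stand-in for a legacy C++/FORTRAN physics library.
libm = ctypes.CDLL(ctypes.util.find_library("m") or None)
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

def legacy_sqrt(x: float) -> float:
    """High-level wrapper hiding the native calling convention."""
    return libm.sqrt(x)

print(legacy_sqrt(2.0))
```

Once such a wrapper exists, exposing it as a web service (the paper's next step) is a matter of putting an HTTP or SOAP layer in front of the wrapped function.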
Abstract:
Clinical decision support systems (CDSSs) often base their knowledge and advice on human expertise. Knowledge representation needs to be in a format that can be easily understood by human users as well as supporting ongoing knowledge engineering, including evolution and consistency of knowledge. This paper reports on the development of an ontology specification for managing knowledge engineering in a CDSS for assessing and managing risks associated with mental-health problems. The Galatean Risk and Safety Tool, GRiST, represents mental-health expertise in the form of a psychological model of classification. The hierarchical structure was directly represented in the machine using an XML document. Functionality of the model and knowledge management were controlled using attributes in the XML nodes, with an accompanying paper manual for specifying how end-user tools should behave when interfacing with the XML. This paper explains the advantages of using the web-ontology language, OWL, as the specification, details some of the issues and problems encountered in translating the psychological model to OWL, and shows how OWL benefits knowledge engineering. The conclusions are that OWL can have an important role in managing complex knowledge domains for systems based on human expertise without impeding the end-users' understanding of the knowledge base. The generic classification model underpinning GRiST makes it applicable to many decision domains and the accompanying OWL specification facilitates its implementation.
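The XML-to-OWL translation discussed above rests on a simple structural idea: a nested XML hierarchy maps onto an OWL subclass hierarchy. The sketch below is illustrative, assuming a made-up concept hierarchy, not GRiST's actual schema or its real OWL axioms.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: translate an XML-encoded hierarchy (as GRiST used)
# into OWL-style subclass axioms. The concept names are invented.
xml_model = """
<concept name="risk">
  <concept name="suicide">
    <concept name="intention"/>
  </concept>
</concept>
"""

def to_subclass_axioms(node, parent=None, axioms=None):
    """Walk the XML tree, emitting one SubClassOf axiom per child node."""
    axioms = [] if axioms is None else axioms
    name = node.get("name")
    if parent is not None:
        axioms.append(f"SubClassOf({name} {parent})")
    for child in node:
        to_subclass_axioms(child, name, axioms)
    return axioms

for axiom in to_subclass_axioms(ET.fromstring(xml_model)):
    print(axiom)
```

The benefit the paper describes follows from this mapping: once the hierarchy lives in OWL rather than ad hoc XML attributes, a reasoner can check consistency instead of a paper manual.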
Abstract:
eHabitat is a Web Processing Service (WPS) designed to compute the likelihood of finding ecosystems with equal properties. Inputs to the WPS, typically thematic geospatial "layers", can be discovered using standardised catalogues, and the outputs tailored to specific end user needs. Because these layers can range from geophysical data captured through remote sensing to socio-economical indicators, eHabitat is exposed to a broad range of different types and levels of uncertainties. Potentially chained to other services to perform ecological forecasting, for example, eHabitat would be an additional component further propagating uncertainties from a potentially long chain of model services. This integration of complex resources increases the challenges in dealing with uncertainty. For such a system, as envisaged by initiatives such as the "Model Web" from the Group on Earth Observations, to be used for policy or decision making, users must be provided with information on the quality of the outputs since all system components will be subject to uncertainty. UncertWeb will create the Uncertainty-Enabled Model Web by promoting interoperability between data and models with quantified uncertainty, building on existing open, international standards. It is the objective of this paper to illustrate a few key ideas behind UncertWeb using eHabitat to discuss the main types of uncertainties the WPS has to deal with and to present the benefits of the use of the UncertWeb framework.
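The core notion of an uncertainty-enabled model chain can be sketched with Monte Carlo propagation: sample the uncertain input, push each sample through the chained models, and report a spread rather than a single value. The two "models" below are toy stand-ins, not eHabitat's algorithms or the UncertWeb framework's actual interfaces.

```python
import random

def model_a(layer):          # toy stand-in: derive an index from a layer value
    return 2.0 * layer + 1.0

def model_b(index):          # toy stand-in: a downstream chained step
    return index ** 0.5

def propagate(mean, sd, n=10000, seed=42):
    """Propagate Gaussian input uncertainty through the model chain,
    returning the mean and standard deviation of the chained output."""
    rng = random.Random(seed)
    outputs = [model_b(model_a(rng.gauss(mean, sd))) for _ in range(n)]
    m = sum(outputs) / n
    var = sum((o - m) ** 2 for o in outputs) / (n - 1)
    return m, var ** 0.5

mean, sd = propagate(4.0, 0.5)
print(f"output mean={mean:.2f}, sd={sd:.2f}")
```

In a standards-based setting the samples or distribution parameters would travel between services as quantified-uncertainty payloads, which is precisely the interoperability UncertWeb targets.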
Abstract:
In 2004 and 2005 we collected samples of phytoplankton, zooplankton and macroinvertebrates in a small artificial pond in Budapest. We set up a simulation model predicting the abundance of the cyclopoids, Eudiaptomus zachariasi and Ischnura pumilio by considering only temperature, acting on the previous day's population abundance. Phytoplankton abundance was simulated by considering not only temperature but also the abundance of the three groups mentioned. This discrete-deterministic model generated patterns similar to those observed, and testing it on historical data was successful. However, because the model overpredicted the abundances of Ischnura pumilio and Cyclopoida at the end of the year, these results were not considered. Running the model with the data series of climate change scenarios gave us an opportunity to predict individual numbers for the period around 2050. If the model is run with the data series of the two scenarios UKHI and UKLO, which predict drastic global warming, we observe a decrease in abundance and a shift in the date at which maximum abundance occurs (except for Ischnura pumilio, whose maximum abundance increases and occurs later), whereas under unchanged climatic conditions (BASE scenario) the change in abundance is negligible. According to the scenarios GFDL 2535, GFDL 5564 and UKTR, a transition could be noticed.
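A discrete-deterministic model of the kind described, where tomorrow's abundance depends only on today's abundance and temperature, can be sketched as a one-step recurrence. The growth function and all parameter values below are invented for illustration; they are not the paper's fitted model.

```python
# Hedged sketch: next-day abundance driven by the previous day's abundance
# and temperature. t_opt and r_max are illustrative, not fitted values.

def daily_step(abundance, temperature, t_opt=20.0, r_max=0.2):
    """Growth rate peaks at t_opt and falls off quadratically around it."""
    r = r_max * (1.0 - ((temperature - t_opt) / 10.0) ** 2)
    return max(abundance * (1.0 + r), 0.0)

def simulate(initial, temperatures):
    """Iterate the recurrence over a daily temperature series."""
    series = [initial]
    for t in temperatures:
        series.append(daily_step(series[-1], t))
    return series

# A warm spell followed by cooling: growth, then decline.
trajectory = simulate(100.0, [20, 21, 19, 12, 8, 5])
print([round(x, 1) for x in trajectory])
```

Feeding such a recurrence the daily temperature series of a climate scenario (UKHI, UKLO, BASE, etc.) is exactly how scenario-dependent abundance curves for 2050 can be generated.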
Abstract:
This dissertation explores the role of artillery forward observation teams during the battle of Okinawa (April–June 1945). It addresses a variety of questions associated with this front line artillery support. First, it examines the role of artillery itself in the American victory over the Japanese on Okinawa. Second, it traces the history of the forward observer in the three decades before the end of World War II. Third, it defines the specific role of the forward observation teams during the battle: what they did and how they did it during this three-month duel. Fourth, it deals with the particular problems of the forward observer. These included coordination with the local infantry commander, adjusting to the periodic rotation between the front lines and the artillery battery behind the line of battle, responding to occasional problems with "friendly fire" (American artillery falling on American ground forces), dealing with personnel turnover in the teams (due to death, wounds, and illness), and finally, developing a more informal relationship between officers and enlisted men to accommodate the reality of this recently created combat assignment. Fifth, it explores the experiences of a select group of men who served on (or in proximity to) forward observation teams on Okinawa. Previous scholars and popular historians of the battle have emphasized the role of Marines, infantrymen, and flame-throwing armor. This work offers a different perspective on the battle and it uses new sources as well. A pre-existing archive of interviews with Okinawan campaign forward observer team members conducted in the 1990s forms the core of the oral history component of this research project. The verbal accounts were checked against and supplemented by a review of unit reports obtained from the U.S. National Archives and various secondary sources. 
The dissertation concludes that an understanding of American artillery observation is critical to a more complete comprehension of the battle of Okinawa. These mid-ranking (and largely middle class) soldiers proved capable of adjusting to the demands of combat conditions. They provide a unique and understudied perspective of the entire battle.
Abstract:
Planktic foraminiferal faunas and modern analogue technique estimates of sea surface temperature (SST) for the last 1 million years (Myr) are compared between core sites to the north (ODP 1125, 178 faunas) and south (DSDP 594, 374 faunas) of the present location of the Subtropical Front (STF), east of New Zealand. Faunas beneath cool subtropical water (STW) north of the STF are dominated by dextral Neogloboquadrina pachyderma, Globorotalia inflata, and Globigerina bulloides, whereas faunas to the south are strongly dominated by sinistral N. pachyderma (80-95% in glacials), with increased G. bulloides (20-50%) and dextral N. pachyderma (15-50%) in interglacials (beneath Subantarctic Water, or SAW). Canonical correspondence analysis indicates that at both sites, SST and related factors were the most important environmental influences on faunal composition. Greater climate-related faunal fluctuations occur in the south. Significant faunal changes occur through time at both sites, particularly towards the end of the mid-Pleistocene climate transition, MIS18-15 (e.g., decline of Globorotalia crassula in STW, disappearance of Globorotalia puncticulata in SAW), and during MIS8-5. Interglacial SST estimates in the north are similar to the present day throughout the last 1 Myr. To the south, interglacial SSTs are more variable with peaks 4-7 °C cooler than present through much of the early and middle Pleistocene, but in MIS11, MIS5.5, and early MIS1, peaks are estimated to have been 2-4 °C warmer than present. These high temperatures are attributed to southward spread of the STF across the submarine Chatham Rise, along which the STF appears to have been dynamically positioned throughout most of the last 1 Myr. For much of the last 1 Myr, glacial SST estimates in the north were only 1-2 °C cooler than the present interglacial, except in MIS16, MIS8, MIS6, and MIS4-2 when estimates are 4-7 °C cooler. 
These cooler temperatures are attributed to jetting of SAW through the Mernoo Saddle (across the Chatham Rise) and/or waning of the STW current. To the south, glacial SST estimates were consistently 10-11 °C cooler than present, similar to temperatures and faunas currently found in the vicinity of the Polar Front. One interpretation is that these cold temperatures reflect thermocline changes and increased Circumpolar Surface Water spinning off the Subantarctic Front as an enhanced Bounty Gyre along the south side of the Chatham Rise. For most of the last 1 Myr, the temperature gradient across the STF has been considerably greater than the present 4 °C. During glacial episodes, the STF in this region did not migrate northwards, but instead there was an intensification of the temperature gradient across it (interglacials 4-11 °C; glacials 8-14 °C).
Abstract:
During the last twenty years (1995-2015), the world of commerce has expanded beyond the traditional brick-and-mortar high street to a global shop front accessible to billions of users via the Worldwide Web (WWW). Consumers are now using the web to immerse themselves in virtual shop fronts, using Social Media (SM) to communicate and share product ideas with friends and family. Retail organisations recognise the need to develop and adapt their strategies to respond to the increasing use of SM. New goals must be set in order to identify how companies will integrate social media into current practices. This research aims to suggest an advisable and comprehensive SM strategy for companies operating in the global retail sector, based on an exploratory analysis of three multi-national retail organisations' existing SM strategies, assessed in conjunction with a broader investigation into social media in the retail industry. From this, a strategy will be devised to improve internal and external communication as well as knowledge management through the use of social media. Findings suggest that the use of SM within the retail industry has dramatically improved collaboration and communication processes: organisations are now able to converse better with stakeholders, and the tools are relatively simple to integrate and implement.
Abstract:
This paper presents a numerical study of a linear compressor cascade to investigate effective end wall profiling rules for highly-loaded axial compressors. The first step in the research applies a correlation analysis to the different flow field parameters, by data mining over 600 profiling samples, to quantify how variations of loss, secondary flow and passage vortex interact with each other under the influence of a profiled end wall. The result identifies the dominant role of corner separation in the control of total pressure loss, providing a principle that only in a flow field with serious corner separation does the profiled end wall change total pressure loss, secondary flow and passage vortex in the same direction. In the second step, a multi-objective optimization of a profiled end wall is performed to reduce loss at the design point and near the stall point. The development of effective end wall profiling rules is based on the manner of secondary flow control rather than the geometry features of the end wall. Using the optimum end wall cases from the Pareto front, a quantitative tool for analyzing secondary flow control is employed. The driving forces induced by a profiled end wall on different regions of the end wall flow are subjected to a detailed analysis and identified for their positive/negative influence in relieving corner separation, from which the effective profiling rules are further confirmed. It is found that the profiling rules for a cascade show distinct differences at the design point and near the stall point; thus, loss control at different operating points is generally independent.
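The Pareto front mentioned above is the set of candidate designs not dominated on both objectives at once, here the two losses to minimise (design point, near stall). The sketch below shows the standard non-dominated filter on made-up candidate values; it is not the paper's optimizer.

```python
# Hedged sketch: extract the Pareto front from two-objective candidates,
# where lower is better on both objectives. Values are illustrative.

def pareto_front(candidates):
    """Keep designs that no other candidate dominates on both objectives."""
    front = []
    for a in candidates:
        dominated = any(
            b[0] <= a[0] and b[1] <= a[1] and b != a for b in candidates
        )
        if not dominated:
            front.append(a)
    return front

designs = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0), (2.5, 2.5), (1.5, 3.5)]
print(pareto_front(designs))  # → [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
```

Picking "optimum end wall cases from the Pareto front", as the paper does, means choosing among exactly such non-dominated trade-offs between the two operating points.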
Abstract:
This template covers the final sections of a thesis - appendices, glossary, list of references and bibliography. Support materials for using the template are referenced near the start of the file. You will want to use this in conjunction with the Front Matter http://www.edshare.soton.ac.uk/9405/ and Chapter http://www.edshare.soton.ac.uk/9403/ templates.
Abstract:
SQL Injection Attack (SQLIA) remains a technique used by computer network intruders to pilfer an organisation's confidential data. An intruder does this by re-crafting web form input and query strings used in web requests with malicious intent, to compromise the security of the organisation's confidential data stored in the back-end database. The database is the most valuable data source, and thus intruders are unrelenting in evolving new techniques to bypass the signature-based solutions currently provided in Web Application Firewalls (WAF) to mitigate SQLIA. There is therefore a need for an automated, scalable methodology for pre-processing SQLIA features fit for a supervised learning model. However, obtaining a ready-made, scalable dataset whose items are feature-engineered with numerical attributes to train Artificial Neural Network (ANN) and Machine Learning (ML) models is a known issue in applying artificial intelligence to effectively address ever-evolving novel SQLIA signatures. The proposed approach applies a numerical-attributes encoding ontology to encode features (both legitimate web requests and SQLIA) as numerical data items, so as to extract a scalable dataset for input to a supervised learning model, moving towards an ML SQLIA detection and prevention model. In the numerical-attributes encoding of features, the proposed model explores a hybrid of static and dynamic pattern matching by implementing a Non-Deterministic Finite Automaton (NFA). This is combined with a proxy and a SQL parser Application Programming Interface (API) to intercept and parse web requests in transit to the back-end database. In developing a solution to address SQLIA, this model allows web requests processed at the proxy and deemed to contain an injected query string to be excluded from reaching the target back-end database.
This paper evaluates the performance metrics of a dataset obtained by numerical encoding of the features ontology in Microsoft Azure Machine Learning (MAML) Studio, using a Two-Class Support Vector Machine (TCSVM) binary classifier. This methodology then forms the subject of the empirical evaluation.
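The numerical-encoding step described above can be sketched as mapping each request string to a vector of pattern-match counts. Regular expressions stand in for the paper's NFA pattern matching (regex engines are themselves automaton-based), and the pattern list and feature layout below are invented for illustration, not the proposed model's actual ontology.

```python
import re

# Hedged sketch: encode a web-request string as a numeric feature vector
# by counting matches of SQLIA-indicative patterns. Patterns are examples.
PATTERNS = [
    re.compile(r"(?i)\bunion\b\s+\bselect\b"),   # UNION-based injection
    re.compile(r"(?i)\bor\b\s+1\s*=\s*1"),       # tautology (OR 1=1)
    re.compile(r"--|#|/\*"),                     # comment truncation
    re.compile(r"'\s*;"),                        # statement break-out
]

def encode(request: str):
    """Return a numeric feature vector fit for a supervised learner."""
    return [len(p.findall(request)) for p in PATTERNS]

print(encode("name=alice&page=2"))   # legitimate request
print(encode("id=1' OR 1=1 --"))     # injected query string
```

Vectors like these, labelled legitimate or SQLIA, are exactly the kind of scalable numeric dataset a TCSVM or ANN can be trained on.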
Abstract:
With its powerful search engines and billions of published pages, the Worldwide Web has become the ultimate tool to explore the human experience. But, despite the advent of the digital revolution, e-books, at their core, have remained remarkably similar to their printed siblings. This has resulted in a clear dichotomy between two ways of reading: on one side, the multi-dimensional world of the Web; on the other, the linearity of books and e-books. My investigation of the literature indicates that the focus of attempts to merge these two modes of production, and hence of reading, has been the insertion of interactivity into fiction. As I will show in the Literature Review, a clear thrust of research since the early 1990s, and in my opinion the most significant, has concentrated on presenting the reader with choices that affect the plot. This has resulted in interactive stories in which the structure of the narrative can be altered by the reader of experimental fiction. The interest in this area of research is not surprising, as the interaction of readers with the fabric of the narrative provides a fertile ground for exploring, analysing, and discussing issues of plot consistency and continuity. I found in the literature several papers concerned with the effects of hyperlinking on literature, but none about how hyperlinked material and narrative could be integrated without compromising the narrative flow as designed by the author. It led me to think that the researchers had accepted hypertextuality and the linear organisation of fiction as being antithetical, thereby ignoring the possibility of exploiting the first while preserving the second. All the works I consulted were focussed on exploring the possibilities provided to authors (and readers) by hypertext or how hypertext literature affects literary criticism. This was true in earlier works by Landow and Harpold and remained true in later works by Bolter and Grusin. 
To quote another example, in his book Hypertext 3.0, Landow states: “Most who have speculated on the relation between hypertextuality and fiction concentrate [...] on the effects it will have on linear narrative”, and “hypertext opens major questions about story and plot by apparently doing away with linear organization” (Landow, 2006, pp. 220, 221). In other words, the authors have added narrative elements to Web pages, effectively placing their stories in a subordinate role. By focussing on “opening up” the plots, the researchers have missed the opportunity to maintain the integrity of their stories and use hyperlinked information to provide interactive access to backstory and factual bases. This would represent a missing link between the traditional way of reading, in which the readers have no influence on the path the author has laid out for them, and interactive narrative, in which the readers choose their way across alternatives, thereby, at least to a certain extent, creating their own path. It would be, to continue the metaphor, as if the readers could follow the main path created by the author while being able to get “sidetracked” into exploring hyperlinked material. In Hypertext 3.0, Landow refers to an “Axial structure [of hypertext] characteristic of electronic books and scholarly books with foot-and endnotes” versus a “Network structure of hypertext” (Landow, 2006, p. 70). My research aims at generalising the axial structure and extending it to fiction without losing the linearity at its core. In creative nonfiction, the introduction of places, scenes, and settings, together with characterisation, brings to life the facts without altering them; while much fiction draws on facts to provide a foundation, or narrative elements, for the work. But how can the reader distinguish between facts and representations? For example, to what extent do dialogues and perceptions present what was actually said and thought? 
Some authors of creative nonfiction use end-notes to provide comments and citations while minimising disruption to the flow of the main text, but end-notes are limited in scope and constrained in space. Each reader should be able to enjoy the narrative as if it were a novel but also to explore the facts at the level of detail s/he needs. For this to be possible, end-notes should provide a Web-like way of exploring in more detail what the author has already researched. My research aims to develop ways of integrating narrative prose and hyperlinked documents into a Hyperbook. Its goal is to create a new writing paradigm in which a story incorporates a gateway to detailed information. While creative nonfiction uses the techniques of fictional writing to provide reportage of actual events, and fact-based fiction illuminates the affectual dimensions of what happened (e.g., Kate Grenville's The Secret River and Hilary Mantel's Wolf Hall), Hyperbooks go one step further and link narrative prose to the details of the events on which the narrative is based or, more generally, to information the reader might find of interest. My dissertation introduces and utilises Hyperbooks to engage in two parallel types of investigation: building knowledge about Italian WWII POWs held in Australia and presenting it as part of a novella in Hyperbook format; and developing a new piece of technology capable of extending the writing and reading process.
Abstract:
Master's Dissertation, Ensino de Informática (Informatics Teaching), Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2014