18 results for classifier, pragmatics, information transport, symbolic logic
in CentAUR: Central Archive University of Reading - UK
Abstract:
We introduce the perspex machine which unifies projective geometry and Turing computation and results in a supra-Turing machine. We show two ways in which the perspex machine unifies symbolic and non-symbolic AI. Firstly, we describe concrete geometrical models that map perspexes onto neural networks, some of which perform only symbolic operations. Secondly, we describe an abstract continuum of perspex logics that includes both symbolic logics and a new class of continuous logics. We argue that an axiom in symbolic logic can be the conclusion of a perspex theorem. That is, the atoms of symbolic logic can be the conclusions of sub-atomic theorems. We argue that perspex space can be mapped onto the spacetime of the universe we inhabit. This allows us to discuss how a robot might be conscious, feel, and have free will in a deterministic, or semi-deterministic, universe. We ground the reality of our universe in existence. On a theistic point, we argue that preordination and free will are compatible. On a theological point, we argue that it is not heretical for us to give robots free will. Finally, we give a pragmatic warning as to the double-edged risks of creating robots that do, or alternatively do not, have free will.
Abstract:
We present a Bayesian image classification scheme for discriminating cloud, clear and sea-ice observations at high latitudes to improve identification of areas of clear sky over ice-free ocean for SST retrieval. We validate the image classification against a manually classified dataset using Advanced Along Track Scanning Radiometer (AATSR) data. A three-way classification scheme using a near-infrared textural feature improves classifier accuracy by 9.9 % over the nadir-only version of the cloud clearing used in the ATSR Reprocessing for Climate (ARC) project in high latitude regions. The three-way classification gives similar numbers of cloud and ice scenes misclassified as clear, but significantly more clear-sky cases are correctly identified (89.9 % compared with 65 % for ARC). We also demonstrate the potential of a Bayesian image classifier including information from the 0.6 micron channel to be used in sea-ice extent and ice surface temperature retrieval, with 77.7 % of ice scenes correctly identified and an overall classifier accuracy of 96 %.
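In its simplest form, a Bayesian classifier of this kind assigns each pixel to whichever class (clear, cloud or sea-ice) has the highest posterior probability given per-class likelihoods of the observed features. The sketch below is a minimal Gaussian naive-Bayes illustration only; the class statistics, priors and the two features are invented placeholders, not the paper's actual AATSR radiances or textural measures.

```python
import math

# Hypothetical per-class (mean, std) statistics for two illustrative
# features: a brightness temperature and a near-infrared textural measure.
CLASS_STATS = {
    "clear":   [(290.0, 3.0), (0.02, 0.01)],
    "cloud":   [(260.0, 8.0), (0.15, 0.05)],
    "sea-ice": [(271.0, 2.0), (0.05, 0.02)],
}
PRIORS = {"clear": 0.5, "cloud": 0.4, "sea-ice": 0.1}

def log_gauss(x, mu, sigma):
    """Log-density of a univariate Gaussian."""
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def classify(features):
    """Return the class with the highest posterior log-probability,
    assuming conditionally independent Gaussian features."""
    best, best_lp = None, -math.inf
    for cls, stats in CLASS_STATS.items():
        lp = math.log(PRIORS[cls])
        lp += sum(log_gauss(x, mu, s) for x, (mu, s) in zip(features, stats))
        if lp > best_lp:
            best, best_lp = cls, lp
    return best
```

A pixel with a warm brightness temperature and low texture, e.g. `classify([289.0, 0.02])`, lands in the "clear" class under these invented statistics.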
Abstract:
Metrics are often used to compare the climate impacts of emissions from various sources, sectors or nations. These are usually based on global-mean input, and so there is the potential that important information on smaller scales is lost. Assuming a non-linear dependence of the climate impact on local surface temperature change, we explore the loss of information about regional variability that results from using global-mean input in the specific case of heterogeneous changes in ozone, methane and aerosol concentrations resulting from emissions from road traffic, aviation and shipping. Results from equilibrium simulations with two general circulation models are used. An alternative metric for capturing the regional climate impacts is investigated. We find that the application of a metric that is first calculated locally and then averaged globally captures a more complete and informative signal of climate impact than one that uses global-mean input. The loss of information when heterogeneity is ignored is largest in the case of aviation. Further investigation of the spatial distribution of temperature change indicates that although the pattern of temperature response does not closely match the pattern of the forcing, the forcing pattern still influences the response pattern on a hemispheric scale. When the short-lived transport forcing is superimposed on present-day anthropogenic CO2 forcing, the heterogeneity in the temperature response to CO2 dominates. This suggests that the importance of including regional climate impacts in global metrics depends on whether small sectors are considered in isolation or as part of the overall climate change.
Abstract:
We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of LOOMI within the OFS framework so that the LOOMI can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is also integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage is effectively the fitting of a one-variable model, this task is very fast. The classifier construction procedure terminates automatically without the need for an additional stopping criterion, yielding very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
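The abstract describes a greedy forward-selection loop in which each stage adds the one candidate term that most improves a leave-one-out criterion, stopping when no candidate helps. The sketch below illustrates that loop shape only: it scores candidate RBF centres by the closed-form leave-one-out squared error of an unregularised least-squares fit, not by the paper's LOOMI criterion, and omits the Bayesian regularisation; the RBF width and the choice of training points as candidate centres are arbitrary placeholders.

```python
import numpy as np

def rbf(x, c, width=1.0):
    """Gaussian radial basis function for sample x and centre c."""
    return np.exp(-np.sum((x - c) ** 2) / (2 * width**2))

def loo_error(Phi, y):
    """Closed-form leave-one-out squared error of a least-squares fit:
    e_loo_i = e_i / (1 - h_ii), with H the hat matrix of the design Phi."""
    H = Phi @ np.linalg.pinv(Phi)
    e = y - H @ y
    return np.mean((e / (1.0 - np.diag(H))) ** 2)

def forward_select(X, y, max_terms=5):
    """Greedily add RBF centres (drawn from the training points) that most
    reduce the leave-one-out error; stop when no candidate improves it."""
    chosen, best = [], np.inf
    while len(chosen) < max_terms:
        scores = []
        for i in range(len(X)):
            if i in chosen:
                scores.append(np.inf)
                continue
            cols = chosen + [i]
            Phi = np.array([[rbf(x, X[j]) for j in cols] for x in X])
            scores.append(loo_error(Phi, y))
        i_best = int(np.argmin(scores))
        if scores[i_best] >= best:
            break  # no remaining candidate improves the LOO score
        best, chosen = scores[i_best], chosen + [i_best]
    return chosen
```

Because the stopping rule is the leave-one-out score itself, the loop terminates on its own, mirroring the sparsity-by-construction property the abstract highlights.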
Abstract:
The ITCT-Lagrangian-2K4 (Intercontinental Transport and Chemical Transformation) experiment was conceived with an aim to quantify the effects of photochemistry and mixing on the transformation of air masses in the free troposphere away from emissions. To this end, attempts were made to intercept and sample air masses several times during their journey across the North Atlantic using four aircraft based in New Hampshire (USA), Faial (Azores) and Creil (France). This article begins by describing forecasts from two Lagrangian models that were used to direct the aircraft into target air masses. A novel technique then identifies Lagrangian matches between flight segments. Two independent searches are conducted: for Lagrangian model matches and for pairs of whole air samples with matching hydrocarbon fingerprints. The information is filtered further by searching for matching hydrocarbon samples that are linked by matching trajectories. The quality of these "coincident matches" is assessed using temperature, humidity and tracer observations. The technique pulls out five clear Lagrangian cases covering a variety of situations and these are examined in detail. The matching trajectories and hydrocarbon fingerprints are shown, and the downwind minus upwind differences in tracers are discussed.
Abstract:
Within this paper modern techniques such as satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve existing techniques for mapping the spatial distribution of sediment transport processes. The processes of interest comprise mass movements such as solifluction, slope wash, dirty avalanches and rock- and boulder falls. They differ considerably in nature and therefore different approaches for the derivation of their spatial extent are required. A major challenge is addressing the differences between the comparably coarse resolution of the available satellite data (Landsat TM/ETM+, 30 m x 30 m) and the actual scale of sediment transport in this environment. A three-step approach has been developed which is based on the concept of Geomorphic Process Units (GPUs): parameterization, process area delineation and combination. Parameters include land cover from satellite data and digital elevation model derivatives. Process areas are identified using a hierarchical classification scheme utilizing thresholds and definition of topology. The approach has been developed for the Karkevagge in Sweden and could be successfully transferred to the Rabotsbekken catchment at Okstindan, Norway using similar input data. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
The research record on the quantification of sediment transport processes in periglacial mountain environments in Scandinavia dates back to the 1950s. A wide range of measurements is available, especially from the Karkevagge region of northern Sweden. Within this paper satellite image analysis and tools provided by geographic information systems (GIS) are exploited in order to extend and improve this research and to complement geophysical methods. The processes of interest include mass movements such as solifluction, slope wash, dirty avalanches and rock- and boulder falls. Geomorphic process units have been derived in order to allow quantification via GIS techniques at a catchment scale. Mass movement rates based on existing field measurements are employed in the budget calculation. In the Karkevagge catchment, 80% of the area can be identified either as a source area for sediments or as a zone where sediments are deposited. The overall budget for the slopes beneath the rockwalls in the Karkevagge is approximately 680 t a(-1), whilst about 150 t a(-1) are transported into the fluvial system.
Abstract:
Mainframes, corporate and central servers are becoming information servers. The requirement for more powerful information servers is the best opportunity to exploit the potential of parallelism. ICL recognized the opportunity of the 'knowledge spectrum', namely to convert raw data into information and then into high-grade knowledge. Its response to this and to the underlying search problems was to introduce the CAFS retrieval engine. The CAFS product demonstrates that it is possible to move functionality within an established architecture, introduce a different technology mix and exploit parallelism to achieve radically new levels of performance. CAFS also demonstrates the benefit of achieving this transparently behind existing interfaces. ICL is now working with Bull and Siemens to develop the information servers of the future by exploiting new technologies as they become available. The objective of the joint Esprit II European Declarative System project is to develop a smoothly scalable, highly parallel computer system, EDS. EDS will in the main be an SQL server and an information server. It will support the many data-intensive applications which the companies foresee; it will also support application-intensive and logic-intensive systems.
Abstract:
Details about the parameters of kinetic systems are crucial for progress in both medical and industrial research, including drug development, clinical diagnosis and biotechnology applications. Such details must be collected by a series of kinetic experiments and investigations. The correct design of the experiment is essential to collecting data suitable for analysis, modelling and deriving the correct information. We have developed a systematic and iterative Bayesian method and sets of rules for the design of enzyme kinetic experiments. Our method selects the optimum design to collect data suitable for accurate modelling and analysis and minimises the error in the parameters estimated. The rules select features of the design such as the substrate range and the number of measurements. We show here that this method can be directly applied to the study of other important kinetic systems, including drug transport, receptor binding, microbial culture and cell transport kinetics. It is possible to reduce the errors in the estimated parameters and, most importantly, to increase efficiency and cost-effectiveness by reducing the number of experiments and data points required. (C) 2003 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
Abstract:
Cold pitched roofs, with their form of construction situating insulation on a horizontal ceiling, are intrinsically vulnerable to condensation. This study reports the results derived from using a simulation package (Heat, Air and Moisture modelling tool, or HAM-Tools) to investigate the risk of condensation in cold pitched roofs in housing fitted with a vapour-permeable underlay (VPU) of known characteristics. In order to visualize the effect of the VPUs on moisture transfer, several scenarios were modelled, and compared with the results from a conventional bituminous felt with high resistance (200 MNs/g, Sd = 40 m). The results indicate that ventilation is essential in the roof to reduce condensation. However, a sensitivity analysis proved that reducing the overall tightness of the ceiling and using lower-resistance VPUs would help in controlling condensation formation in the roof. To a large extent, the proposed characteristic performance of the VPU as predicted by manufacturers and some researchers may only be realistic if gaps in the ceiling are sealed completely during construction, which may be practically difficult given current construction practice.
Abstract:
We present a novel approach to intrusion detection using an intelligent data classifier based on a self-organizing map (SOM). We have surveyed other unsupervised intrusion detection methods, alternative SOM-based techniques and the KDD-winning IDS methods. This paper provides a robustly designed and implemented intelligent data classifier technique based on a single large (30x30) self-organizing map (SOM) capable of detecting all types of attacks given in the DARPA Archive 1999, with a lowest false positive rate of 0.04% and a highest detection rate of 99.73% when tested using the full KDD data sets, and a comparable detection rate of 89.54% with a lowest false positive rate of 0.18% when tested using the corrected data sets.
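A SOM of the kind described is trained by repeatedly finding the best-matching unit (BMU) for each input and pulling that unit and its grid neighbours toward the sample. The sketch below is a minimal illustration only: the grid size, learning-rate and neighbourhood schedules are chosen arbitrarily and are not the paper's 30x30 configuration, and real connection-record features would need normalisation first.

```python
import numpy as np

def train_som(data, grid=(5, 5), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing map: each step moves the best-matching
    unit (BMU) and its grid neighbours toward the presented sample, with
    learning rate and neighbourhood radius decaying over the epochs."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    # Grid coordinates of each unit, used for the neighbourhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"),
                      axis=-1)
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)               # decaying learning rate
        sigma = sigma0 * (1 - t / epochs) + 0.5   # decaying neighbourhood
        for x in rng.permutation(data):
            d = np.sum((weights - x) ** 2, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighbourhood weighting centred on the BMU.
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=-1) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
    return weights
```

Labelling each trained unit with the majority class of the training records it wins then turns the map into a classifier; unlabelled or rarely-hit units can flag anomalous traffic.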
Abstract:
The high variability of the intensity of suprathermal electron flux in the solar wind is usually ascribed to the high variability of sources on the Sun. Here we demonstrate that a substantial amount of the variability arises from peaks in stream interaction regions, where fast wind runs into slow wind and creates a pressure ridge at the interface. Superposed epoch analysis centered on stream interfaces in 26 interaction regions previously identified in Wind data reveals a twofold increase in 250 eV flux (integrated over pitch angle). Whether the peaks result from the compression there or are solar signatures of the coronal hole boundary, to which interfaces may map, is an open question. Suggestive of the latter, some cases show a displacement between the electron and magnetic field peaks at the interface. Since solar information is transmitted to 1 AU much more quickly by suprathermal electrons compared to convected plasma signatures, the displacement may imply a shift in the coronal hole boundary through transport of open magnetic flux via interchange reconnection. If so, however, the fact that displacements occur in both directions and that the electron and field peaks in the superposed epoch analysis are nearly coincident indicates that any systematic transport expected from differential solar rotation is overwhelmed by a random pattern, possibly owing to transport across a ragged coronal hole boundary.
Abstract:
This paper describes the development of an experimental distributed fuzzy control system for heating and ventilation (HVAC) systems within a building. Each local control loop is affected by a number of local variables, as well as information from neighboring controllers. By including this additional information it is hoped that a more equal allocation of resources can be achieved.
Abstract:
A method of classifying the upper tropospheric/lower stratospheric (UTLS) jets has been developed that allows satellite and aircraft trace gas data and meteorological fields to be efficiently mapped in a jet coordinate view. A detailed characterization of multiple tropopauses accompanies the jet characterization. Jet climatologies show the well-known high altitude subtropical and lower altitude polar jets in the upper troposphere, as well as a pattern of concentric polar and subtropical jets in the Southern Hemisphere, and shifts of the primary jet to high latitudes associated with blocking ridges in Northern Hemisphere winter. The jet-coordinate view segregates air masses differently than the commonly-used equivalent latitude (EqL) coordinate throughout the lowermost stratosphere and in the upper troposphere. Mapping O3 data from the Aura Microwave Limb Sounder (MLS) satellite and the Winter Storms aircraft datasets in jet coordinates thus emphasizes different aspects of the circulation compared to an EqL-coordinate framework: the jet coordinate reorders the data geometrically, thus highlighting the strong PV, tropopause height and trace gas gradients across the subtropical jet, whereas EqL is a dynamical coordinate that may blur these spatial relationships but provides information on irreversible transport. The jet coordinate view identifies the concentration of stratospheric ozone well below the tropopause in the region poleward of and below the jet core, as well as other transport features associated with the upper tropospheric jets. Using the jet information in EqL coordinates allows us to study trace gas distributions in regions of weak versus strong jets, and demonstrates weaker transport barriers in regions with less jet influence. 
MLS and Atmospheric Chemistry Experiment-Fourier Transform Spectrometer trace gas fields for spring 2008 in jet coordinates show very strong, closely correlated, PV, tropopause height and trace gas gradients across the jet, and evidence of intrusions of stratospheric air below the tropopause below and poleward of the subtropical jet; these features are consistent between instruments and among multiple trace gases. Our characterization of the jets is facilitating studies that will improve our understanding of upper tropospheric trace gas evolution.
Abstract:
Previously, governments have responded to the impacts of economic failures and consequently have developed more regulations to protect employees, customers, shareholders and the economic wellbeing of the state. Our research addresses how Accounting Information Systems (AIS) may act as carriers for institutionalised practices associated with maintaining regulatory compliance within the context of UK Asset Management Houses. The AIS was found to be a strong conduit for institutionalized compliance related practices, utilising symbolic systems, relational systems, routines and artefacts to carry approaches relating to regulative, normative and cultural-cognitive strands of institutionalism. Thus, AIS are integral to the development and dissipation of best practice for the management of regulatory compliance. As institutional elements are clearly present we argue that AIS and regulatory compliance provide a rich context to further institutionalism. Since AIS may act as conduits for regulatory approaches, both systems adopters and clients may benefit from actively seeking to codify and abstract best practices into AIS. However, the application of generic institutionalized approaches, which may be applied across similar organizations, must be tempered with each firm’s business environment and associated regulatory exposure. A balance should be sought between approaches specific enough to be useful but generic enough to be universally applied.