128 results for artificial intelligence


Relevance:

60.00%

Publisher:

Abstract:

It is becoming clear that, contrary to earlier expectations, the application of AI techniques to law is neither as easy nor as effective as some have claimed. Unfortunately, most AI researchers seem to have little understanding of just why this is. In this paper I argue, from an empirical study of lawyers in action, why there is a mismatch between the AI view of law and law in practice. While this is important and novel in itself, it also demonstrates, if my arguments are accepted, why AI will never succeed in producing the computerised lawyer.

Relevance:

60.00%

Publisher:

Abstract:

Politeness has been researched within many disciplines. Although Brown and Levinson’s theory of politeness (1978, 1987) is often cited, it is primarily a linguistic theory and has been criticized for its limited generalizability across cultures. Consequently, there is a need for a more comprehensive approach to understanding and explaining politeness. We suggest applying a social signal framework that treats politeness as a communicative state. In doing so, we aim to unify and explain politeness and the corresponding research, and to identify further research needed in this area.

Relevance:

60.00%

Publisher:

Abstract:

Traditional static analysis fails to auto-parallelize programs with complex control and data flow. Furthermore, thread-level parallelism in such programs is often restricted to pipeline parallelism, which can be hard for a programmer to discover. In this paper we propose a tool that, based on profiling information, helps the programmer discover parallelism. The programmer hand-picks code transformations from among the proposed candidates, which are then applied by automatic code transformation techniques.

This paper contributes to the literature by presenting a profiling tool for discovering thread-level parallelism. We track dependencies at the whole-data-structure level rather than at the element or byte level in order to limit the profiling overhead, and we perform a thorough analysis of the needs and costs of this technique. Furthermore, we present and validate the belief that programs with complex control and data flow contain significant amounts of exploitable coarse-grain pipeline parallelism in their outer loops. This observation validates our approach to whole-data-structure dependencies. As state-of-the-art compilers focus on loops iterating over data structure members, it also explains why our approach finds coarse-grain pipeline parallelism in cases that have remained out of reach for state-of-the-art compilers. In cases where traditional compilation techniques do find parallelism, our approach discovers higher degrees of parallelism, yielding a 40% speedup over traditional compilation techniques. Moreover, we demonstrate real speedups on multiple hardware platforms.
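
As a rough illustration of the kind of coarse-grain pipeline parallelism described above, the following Python sketch splits a sequential outer loop into two stages that exchange whole records through queues. The stage functions parse and transform are hypothetical stand-ins for real per-iteration work, and in CPython the global interpreter lock limits the achievable speedup for pure-Python stages, so the sketch only conveys the structure of the transformation, not the tool’s actual code generation.

import queue
import threading

SENTINEL = object()  # marks the end of the record stream

def stage(worker, inbox, outbox):
    # One pipeline stage: apply `worker` to every record flowing through.
    while True:
        item = inbox.get()
        if item is SENTINEL:
            outbox.put(SENTINEL)
            break
        outbox.put(worker(item))

# Hypothetical per-iteration work of the original sequential outer loop;
# each stage consumes and produces a whole data structure (a record),
# which is the granularity at which dependencies are tracked above.
def parse(raw):
    return raw.split(',')

def transform(record):
    return [field.upper() for field in record]

def run_pipeline(raw_records):
    q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
    workers = [
        threading.Thread(target=stage, args=(parse, q_in, q_mid)),
        threading.Thread(target=stage, args=(transform, q_mid, q_out)),
    ]
    for t in workers:
        t.start()
    for raw in raw_records:
        q_in.put(raw)
    q_in.put(SENTINEL)
    results = []
    while True:
        item = q_out.get()
        if item is SENTINEL:
            break
        results.append(item)
    for t in workers:
        t.join()
    return results

print(run_pipeline(["a,b", "c,d"]))  # [['A', 'B'], ['C', 'D']]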

Relevance:

60.00%

Publisher:

Abstract:

As a promising method for pattern recognition and function estimation, least squares support vector machines (LS-SVM) express training as the solution of a linear system instead of the quadratic programming problem of conventional support vector machines (SVM). In this paper, by using the information provided by the equality constraint, we transform the minimization problem with a single equality constraint in LS-SVM into an unconstrained minimization problem and then propose reduced formulations for LS-SVM. Through this transformation, the number of applications of the conjugate gradient (CG) method, a highly time-consuming step in obtaining the numerical solution, is reduced from two, as proposed by Suykens et al. (1999), to one. A comparison of the computational speed of our method with the CG method of Suykens et al. and with first-order and second-order SMO methods on several benchmark data sets shows a reduction in training time of up to 44%. (C) 2011 Elsevier B.V. All rights reserved.
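
For context, the following Python sketch shows the standard LS-SVM classification training step that the abstract takes as its starting point: a single linear system in the bias b and the support values alpha, solved here directly with numpy rather than with the conjugate gradient iterations discussed above. The RBF kernel and the values of gamma and sigma are illustrative choices; the paper’s reduced unconstrained formulation is not reproduced here.

import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM KKT system
    #   [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
    # where Omega_ij = y_i * y_j * K(x_i, x_j).
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    solution = np.linalg.solve(A, rhs)
    return solution[0], solution[1:]  # bias b, support values alpha

def lssvm_predict(X_train, y_train, alpha, b, X_new, sigma=1.0):
    # Decision function: sign(sum_i alpha_i * y_i * K(x_i, x) + b).
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    return np.sign(K @ (alpha * y_train) + b)

# Toy usage on two linearly separable points.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
y = np.array([-1.0, 1.0])
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, y, alpha, b, X))  # [-1.  1.]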

Relevance:

60.00%

Publisher:

Abstract:

Most learning methods reported to date for Takagi-Sugeno-Kang (TSK) fuzzy neural models focus mainly on improving their accuracy. However, one of the key design requirements in building an interpretable fuzzy model is that each obtained rule consequent must match the local behaviour of the system well when all the rules are aggregated to produce the overall system output. This is one of the characteristics that distinguish fuzzy models from black-box models such as neural networks. Therefore, finding a desirable set of fuzzy partitions and, hence, identifying the corresponding consequent models that can be directly explained in terms of system behaviour is a critical step in fuzzy neural modelling. In this paper, a new learning approach that considers both the nonlinear parameters in the rule premises and the linear parameters in the rule consequents is proposed. Unlike the conventional two-stage optimization procedure widely practised in the field, in which the two sets of parameters are optimized separately, the consequent parameters are expressed as a set dependent on the premise parameters, thereby enabling a new integrated gradient-descent learning approach. A new Jacobian matrix is proposed and efficiently computed to achieve a more accurate approximation of the cost function using the second-order Levenberg-Marquardt optimization method. Several other interpretability issues of the fuzzy neural model are also discussed and integrated into this new learning approach. Numerical examples are presented to illustrate the resulting structure of the fuzzy neural models and the effectiveness of the proposed algorithm, and the results are compared with those of some well-known methods.
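
The core idea of treating the consequents as a dependent set can be illustrated with a small Python sketch for a simplified zero-order TSK model with Gaussian premises and a single input: for any given premise parameters, the consequents are recovered by least squares, so the Levenberg-Marquardt optimizer (here scipy’s, with a numerically approximated Jacobian rather than the analytic one derived in the paper) only searches over the premise centres and widths. The model structure and data below are illustrative and not taken from the paper.

import numpy as np
from scipy.optimize import least_squares  # method='lm' is Levenberg-Marquardt

def firing_strengths(x, centres, widths):
    # Normalised Gaussian firing strengths, one rule per (centre, width) pair.
    w = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2.0 * widths[None, :] ** 2))
    return w / w.sum(axis=1, keepdims=True)

def residuals(theta, x, y, n_rules):
    # Premise parameters are the only free variables seen by the optimizer.
    centres = theta[:n_rules]
    widths = np.abs(theta[n_rules:]) + 1e-6
    Phi = firing_strengths(x, centres, widths)
    # Consequents as a dependent set: least-squares fit given the premises.
    consequents, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return Phi @ consequents - y

# Toy usage: approximate sin(x) on [0, pi] with four rules.
x = np.linspace(0.0, np.pi, 50)
y = np.sin(x)
n_rules = 4
theta0 = np.concatenate([np.linspace(0.0, np.pi, n_rules), np.full(n_rules, 0.5)])
fit = least_squares(residuals, theta0, args=(x, y, n_rules), method='lm')
print(fit.cost)  # residual cost after integrated premise/consequent learning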

Relevance:

60.00%

Publisher:

Abstract:

The concept of exospace, as an alternative liveable structure, is discussed in this article to improve our comprehension of architectural space. Exospace is a man-made space designed for living beyond Earth’s atmosphere. Humankind has developed outer-space technologies to build the International Space Station (ISS) as a significant experiment in exospace design. The ISS is a new building type for scientific experiments and for testing human existence in outer space.

A fictional example of exospace, on the other hand, is the spaceship Discovery 1 in Stanley Kubrick’s legendary science fiction film 2001: A Space Odyssey (1968), a ship travelling to Jupiter with a crew of five astronauts and HAL 9000, the artificial intelligence controlling the ship. I will first discuss the ISS, and the space stations built before it, from a spatial point of view. A spatial study of Discovery 1 will follow. Finally, through an understanding of exospace, I will return to architectural space with a critical appraisal. Comparing architectural space with exospace will add to the discussion of space theories from a technological perspective.

Exospace creates an alternative reality to architectural space. Architects cannot consider exospaces without comparing them with the spaces they design on Earth. The different context of outer space shows that a work of terrestrial architecture is very much dependent on its context. A building is not an ‘object’ that can be located anywhere; it is designed for its site. Architectural space is a real, material, continuous, static and extroverted habitable space designed for and used in the specific physical context of Earth. The existence of exospace in science opens a new discussion in architectural theory, both terrestrial and extraterrestrial.

Relevance:

60.00%

Publisher:

Abstract:

With a significant increase in the number of digital cameras used for various purposes, there is a pressing demand for advanced video analysis techniques that can systematically interpret and understand the semantics of video content recorded in security surveillance, intelligent transportation, health care, and video retrieval and summarization. Understanding and interpreting human behaviours through video analysis faces considerable challenges due to non-rigid human motion, self- and mutual occlusions, and changes in lighting conditions. To address these problems, advanced image and signal processing technologies such as neural networks, fuzzy logic, probabilistic estimation theory and statistical learning have been extensively investigated.