130 results for Fléchier, Esprit, 1632-1710.


Relevance: 10.00%

Publisher:

Abstract:

The decision of the U.S. Supreme Court in 1991 in Feist Publications, Inc. v. Rural Tel. Service Co. affirmed originality as a constitutional requirement for copyright. Originality has a specific sense and is constituted by a minimal degree of creativity and independent creation. The not original is the more developed concept within the decision. It includes the absence of a minimal degree of creativity as a major constituent. Different levels of absence of creativity also are distinguished, from the extreme absence of creativity to insufficient creativity. There is a gestalt effect of analogy between the delineation of the not original and the concept of computability. More specific correlations can be found within the extreme absence of creativity. "[S]o mechanical" in the decision can be correlated with an automatic mechanical procedure and clauses with a historical resonance with understandings of computability as what would naturally be regarded as computable. The routine within the extreme absence of creativity can be regarded as the product of a computational process. The concern of this article is with rigorously establishing an understanding of the extreme absence of creativity, primarily through the correlations with aspects of computability. The understanding established is consistent with the other elements of the not original. It is also revealed as testable under real-world conditions. The possibilities for understanding insufficient creativity, a minimal degree of creativity, and originality, from the understanding developed of the extreme absence of creativity, are indicated.

Relevance: 10.00%

Publisher:

Abstract:

The mechanism of energy-converting NADH:ubiquinone oxidoreductase (complex I) is still unknown. A current controversy centers around the question whether electron transport of complex I is always linked to vectorial proton translocation or whether in some organisms the enzyme pumps sodium ions instead. To develop better experimental tools to elucidate its mechanism, we have reconstituted the affinity-purified enzyme into proteoliposomes and monitored the generation of ΔpH and Δψ. We tested several detergents to solubilize the asolectin used for liposome formation. Tightly coupled proteoliposomes containing highly active complex I were obtained by detergent removal with BioBeads after total solubilization of the phospholipids with n-octyl-beta-D-glucopyranoside. We have used dyes to monitor the formation of the two components of the proton motive force, ΔpH and Δψ, across the liposomal membrane, and analyzed the effects of inhibitors, uncouplers and ionophores on this process. We show that electron transfer of complex I of the lower eukaryote Y. lipolytica is clearly linked to proton translocation. While this study was not specifically designed to demonstrate possible additional sodium-translocating properties of complex I, we did not find indications for primary or secondary Na+ translocation by Y. lipolytica complex I.

Relevance: 10.00%

Publisher:

Abstract:

La3FMo4O16 crystallizes in the triclinic crystal system with space group P-1 [a = 724.86(2) pm, b = 742.26(2) pm, c = 1469.59(3) pm, α = 101.683(2)°, β = 102.118(2)°, γ = 100.279(2)°] with two formula units per unit cell. The three crystallographically independent La3+ cations each show a coordination number of nine, with one F- and eight O2- anions forming distorted monocapped square antiprisms. The fluoride anion is coordinated by all three lanthanum cations to form a nearly planar triangle. Besides three crystallographically independent tetrahedral [MoO4]2- units, a fourth one with a higher coordination number (CN = 4+1) can be found in the crystal structure, forming a dimeric entity with the formula [Mo2O8]4- consisting of two edge-connected square pyramids. Several spectroscopic measurements were performed on the title compound, such as infrared, Raman, and diffuse reflectance spectroscopy. Furthermore, La3FMo4O16 was investigated for its capacity to work as a host material for doping with luminescence-active cations, such as Ce3+ or Pr3+. Therefore, luminescence spectroscopic as well as EPR measurements were performed on doped samples of the title compound. Both the pure and the doped compounds can be synthesized by fusing La2O3, LaF3 and MoO3 (ratio 4:1:12; ca. 1 % CeF3 and PrF3 as dopant, respectively) in evacuated silica ampoules at 850 °C for 7 d.

Relevance: 10.00%

Publisher:

Abstract:

Hunter and Konieczny explored the relationships between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base in several of their papers. In particular, an inconsistency value termed MIVC, defined from minimal inconsistent subsets, can be considered as a Shapley Inconsistency Value. Moreover, it can be axiomatized completely in terms of five simple axioms. MinInc, one of the five axioms, states that each minimal inconsistent set has the same amount of conflict. However, it conflicts with the intuition illustrated by the lottery paradox, which states that as the size of a minimal inconsistent belief base increases, the degree of inconsistency of that belief base becomes smaller. To address this, we present two kinds of revised inconsistency measures for a belief base from its minimal inconsistent subsets. Each of these measures considers the size of each minimal inconsistent subset as well as the number of minimal inconsistent subsets of a belief base. More specifically, we first present a vectorial measure to capture the inconsistency for a belief base, which is more discriminative than MIVC. Then we present a family of weighted inconsistency measures based on the vectorial inconsistency measure, which allow us to capture the inconsistency for a belief base in terms of a single numerical value as usual. We also show that each of the two kinds of revised inconsistency measures can be considered as a particular Shapley Inconsistency Value, and can be axiomatically characterized by the corresponding revised axioms presented in this paper.
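
The contrast between MinInc-style counting and the size-sensitive revision lends itself to a small executable sketch. The Python below is an illustrative brute-force encoding, not the paper's formalism: formulas are Boolean functions, consistency is checked by exhausting truth assignments, and the two measures compare the view where every minimal inconsistent subset contributes 1 against a revision where a subset M contributes 1/|M|, reflecting the lottery-paradox intuition that larger minimal conflicts are less inconsistent.

```python
from itertools import combinations, product

def consistent(formulas, variables):
    """True if some truth assignment satisfies every formula."""
    return any(all(f(dict(zip(variables, vals))) for f in formulas)
               for vals in product([True, False], repeat=len(variables)))

def minimal_inconsistent_subsets(base, variables):
    """All subset-minimal inconsistent subsets of a belief base (as index tuples)."""
    mis = []
    for r in range(1, len(base) + 1):  # increasing size guarantees minimality
        for subset in combinations(range(len(base)), r):
            if not consistent([base[i] for i in subset], variables):
                if not any(set(m) <= set(subset) for m in mis):
                    mis.append(subset)
    return mis

def mi_count(mis):
    """MinInc-style: every minimal inconsistent subset contributes 1."""
    return len(mis)

def mi_size_weighted(mis):
    """Revised: a minimal inconsistent subset M contributes only 1/|M|."""
    return sum(1.0 / len(m) for m in mis)
```

On {a, not a} the weighted measure gives 1/2, while on the larger conflict {a, b, not a or not b} it gives only 1/3: the bigger the minimal inconsistent subset, the smaller its contribution, which is exactly where this family of measures departs from the MinInc axiom.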

Relevance: 10.00%

Publisher:

Abstract:

In the last decade, data mining has emerged as one of the most dynamic and lively areas in information technology. Although many algorithms and techniques for data mining have been proposed, they either focus on domain-independent techniques or on very specific domain problems. A general requirement in bridging the gap between academia and business is to cater to general domain-related issues surrounding real-life applications, such as constraints, organizational factors, domain expert knowledge, domain adaptation, and operational knowledge. Unfortunately, these either have not been addressed, or have not been sufficiently addressed, in current data mining research and development. Domain-Driven Data Mining (D3M) aims to develop general principles, methodologies, and techniques for modeling and merging comprehensive domain-related factors and synthesized ubiquitous intelligence surrounding problem domains with the data mining process, and for discovering knowledge to support business decision-making. This paper aims to report original, cutting-edge, and state-of-the-art progress in D3M. It covers theoretical and applied contributions aiming to: 1) propose next-generation data mining frameworks and processes for actionable knowledge discovery, 2) investigate effective (automated, human- and machine-centered, and/or human-machine-cooperated) principles and approaches for acquiring, representing, modelling, and engaging ubiquitous intelligence in real-world data mining, and 3) develop workable and operational systems balancing technical significance and application concerns, and converting and delivering actionable knowledge into operational application rules to seamlessly engage application processes and systems.

Relevance: 10.00%

Publisher:

Abstract:

Developing a desirable framework for handling inconsistencies in software requirements specifications is a challenging problem. It has been widely recognized that the relative priority of requirements can help developers to make some necessary trade-off decisions for resolving conflicts. However, in most distributed development such as viewpoints-based approaches, different stakeholders may assign different levels of priority to the same shared requirements statement from their own perspectives. The disagreement in the local levels of priority assigned to the same shared requirements statement often puts developers into a dilemma during the inconsistency handling process. The main contribution of this paper is to present a prioritized merging-based framework for handling inconsistency in distributed software requirements specifications. Given a set of distributed inconsistent requirements collections with local prioritization, we first construct a requirements specification with a prioritization from an overall perspective. We provide two approaches to constructing a requirements specification with a global prioritization: a merging-based construction and a priority vector-based construction. Following this, we derive proposals for handling inconsistencies from the globally prioritized requirements specification in terms of prioritized merging. Moreover, from the overall perspective, these proposals may be viewed as the most appropriate for modifying the given inconsistent requirements specification in the sense of the ordering relation over all the consistent subsets of the requirements specification. Finally, we consider applying negotiation-based techniques to viewpoints so as to identify an acceptable common proposal from these proposals.
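
The priority-merging idea can be sketched in a few lines of Python. Everything here is a hypothetical simplification, not the paper's construction: requirements are atoms, conflicts are given pairs, the global level of a shared requirement is taken as its best (lowest-numbered, most important) local level, and a consistent proposal is built greedily in global priority order.

```python
def global_priority(local_views):
    """Merge per-viewpoint priority maps into one global map.

    Each view maps requirement -> local priority level (1 = most important).
    A shared requirement gets its best local level, one simple stand-in
    for a merging-based construction of the global prioritization.
    """
    merged = {}
    for view in local_views:
        for req, level in view.items():
            merged[req] = min(level, merged.get(req, level))
    return merged

def resolve(priorities, conflicts):
    """Keep requirements greedily in global priority order, skipping any
    requirement that conflicts with one already kept."""
    kept = []
    for req in sorted(priorities, key=lambda r: priorities[r]):
        if all((req, k) not in conflicts and (k, req) not in conflicts
               for k in kept):
            kept.append(req)
    return kept
```

For example, if one viewpoint ranks "encrypt" above "log_plaintext" while another holds the opposite, and the two requirements conflict, the merged prioritization ties them and the greedy pass keeps whichever survives the conflict check, yielding a single consistent proposal from the globally prioritized specification.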

Relevance: 10.00%

Publisher:

Abstract:

Hardware synthesis from dataflow graphs of signal processing systems is a growing research area as focus shifts to high level design methodologies. For data intensive systems, dataflow based synthesis can lead to an inefficient usage of memory due to the restrictive nature of synchronous dataflow and its inability to easily model data reuse. This paper explores how dataflow graph changes can be used to drive both the on-chip and off-chip memory organisation and how these memory architectures can be mapped to a hardware implementation. By exploiting the data reuse inherent to many image processing algorithms and by creating memory hierarchies, off-chip memory bandwidth can be reduced by a factor of a thousand from the original dataflow graph level specification of a motion estimation algorithm, with a minimal increase in memory size. This analysis is verified using results gathered from implementation of the motion estimation algorithm on a Xilinx Virtex-4 FPGA, where the delay between the memories and processing elements drops from 14.2 ns down to 1.878 ns through the refinement of the memory architecture. Care must be taken when modeling these algorithms, however, as inefficiencies in these models can easily be translated into overuse of hardware resources.

Relevance: 10.00%

Publisher:

Abstract:

In this paper, a novel video-based multimodal biometric verification scheme using the subspace-based low-level feature fusion of face and speech is developed for specific speaker recognition for perceptual human-computer interaction (HCI). In the proposed scheme, the human face is tracked and face pose is estimated to weight the detected facelike regions in successive frames, where ill-posed faces and false-positive detections are assigned lower credit to enhance the accuracy. In the audio modality, mel-frequency cepstral coefficients are extracted for voice-based biometric verification. In the fusion step, features from both modalities are projected into a nonlinear Laplacian Eigenmap subspace for multimodal speaker recognition and combined at low level. The proposed approach is tested on a video database of ten human subjects, and the results show that the proposed scheme can attain better accuracy in comparison with the conventional multimodal fusion using latent semantic analysis as well as the single-modality verifications. The experiment in MATLAB shows the potential of the proposed scheme to attain real-time performance for perceptual HCI applications.

Relevance: 10.00%

Publisher:

Abstract:

The paper presents a simple game-theoretic model of two Internet service providers (ISPs), drawn from a larger set consisting of Tier-1 and Tier-2 ISPs, who choose between peering and transit agreements. The study focuses on the costs of interconnection, taking into account traffic imbalances. The analysis suggests that if the traffic flows and the costs of interconnection are fairly shared, the providers peer; otherwise, they choose transit. Moreover, the joint profits are maximized under the transit arrangement.
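
The two-strategy structure of such a model can be illustrated with a toy 2x2 game. All payoff numbers below are hypothetical, chosen only so that fairly shared traffic makes peering a dominant strategy while an imbalance pushes both providers to transit (with joint profits highest under transit, as the abstract states); they are not the paper's payoffs.

```python
STRATEGIES = ("peer", "transit")

def best_responses(payoffs):
    """Pure-strategy Nash equilibria of a two-player game.

    payoffs maps (row_strategy, col_strategy) -> (row_payoff, col_payoff).
    A cell is an equilibrium if neither player gains by deviating alone.
    """
    equilibria = []
    for r, c in payoffs:
        row_ok = all(payoffs[(r, c)][0] >= payoffs[(r2, c)][0] for r2 in STRATEGIES)
        col_ok = all(payoffs[(r, c)][1] >= payoffs[(r, c2)][1] for c2 in STRATEGIES)
        if row_ok and col_ok:
            equilibria.append((r, c))
    return equilibria

# Balanced traffic, fairly shared costs: peering dominates for both ISPs.
balanced = {
    ("peer", "peer"): (5, 5),
    ("peer", "transit"): (4, 2),
    ("transit", "peer"): (2, 4),
    ("transit", "transit"): (3, 3),
}

# Imbalanced traffic: the ISP carrying the excess load loses under peering,
# so both choose transit; note the joint profit (6) is also highest there.
imbalanced = {
    ("peer", "peer"): (1, 4),
    ("peer", "transit"): (0, 5),
    ("transit", "peer"): (2, 1),
    ("transit", "transit"): (3, 3),
}
```

Running `best_responses` on the two matrices yields (peer, peer) as the unique equilibrium of the balanced game and (transit, transit) for the imbalanced one, mirroring the qualitative conclusion of the abstract.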

Relevance: 10.00%

Publisher:

Abstract:

We consider homogeneous two-sided markets, in which connected buyer-seller pairs bargain and trade repeatedly. In this infinite market game with exogenous matching probabilities and a common discount factor, we prove the existence of equilibria in stationary strategies. The equilibrium payoffs are given implicitly as a solution to a system of linear equations. Then, we endogenize the matching mechanism in a link formation stage that precedes the market game. When agents are sufficiently patient and link costs are low, we provide an algorithm to construct minimally connected networks that are pairwise stable with respect to the expected payoffs in the trading stage. The constructed networks are essentially efficient and consist of components with a constant buyer-seller ratio. The latter ratio increases (decreases) for a buyer (seller) that deletes one of her links in a pairwise stable component.