76 results for Unified Lending
Abstract:
Southern Tiwa (Tanoan) exhibits agreement with up to three arguments (ergative, absolutive, dative). This agreement is subject to certain restrictions resembling the Person-Case Constraint paradigm (Bonet 1991). Moreover, there is a correlation between agreement restrictions and conditions on (the obviation of) noun-incorporation in Southern Tiwa, as explicitly and elegantly captured by Rosen (1990) in terms of a heterogeneous feature hierarchy and rules of association. We attempt to recast Rosen’s central insights in terms of Anagnostopoulou’s probe-sharing model of Person-Case Constraint effects (Anagnostopoulou 2003, 2006), to show that the full range of Southern Tiwa agreement and (non-)incorporation restrictions can be given a single, unified analysis within the probe-goal-Agree framework of Chomsky (2001). In particular, we argue that Southern Tiwa’s triple-agreement system is characterized by (a) an independent class probe located on the heads T and v, and (b) a rule that allows this class probe to be deleted in the context of local-person T-agreement. The various restrictions on agreement and non-incorporation then reduce to a single source: failure of class-valuation with DP (as opposed to NP) arguments.
Abstract:
This paper investigates the distribution of the condition number of complex Wishart matrices. Two closely related measures are considered: the standard condition number (SCN) and the Demmel condition number (DCN), both of which have important applications in the context of multiple-input multiple-output (MIMO) communication systems, as well as in various branches of mathematics. We first present a novel generic framework for the SCN distribution which accounts for both central and non-central Wishart matrices of arbitrary dimension. This result is a simple unified expression which involves only a single scalar integral, and therefore allows for fast and efficient computation. For the case of dual Wishart matrices, we derive new exact polynomial expressions for both the SCN and DCN distributions. We also formulate a new closed-form expression for the tail SCN distribution which applies to correlated central Wishart matrices of arbitrary dimension and demonstrates an interesting connection to the maximum eigenvalue moments of Wishart matrices of smaller dimension. Based on our analytical results, we gain valuable insights into the statistical behavior of the channel conditioning for various MIMO fading scenarios, such as uncorrelated/semi-correlated Rayleigh fading and Ricean fading.
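As a rough companion to these analytical results, here is a minimal Monte Carlo sketch (illustrative only, not the paper's closed-form expressions) of the two measures for a central complex Wishart matrix W = HᴴH, where H is an n × m i.i.d. complex Gaussian channel matrix. The SCN is taken as λ_max/λ_min and the DCN as tr(W)/λ_min, the convention usually adopted in the MIMO literature and assumed here rather than quoted from the paper.

```python
# Monte Carlo sketch of SCN and DCN samples for a central complex Wishart
# matrix W = H^H H (H: n x m i.i.d. CN(0,1) entries). Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def wishart_condition_numbers(n, m, trials):
    """Sample SCN = l_max/l_min and DCN = tr(W)/l_min for W = H^H H."""
    scn, dcn = np.empty(trials), np.empty(trials)
    for t in range(trials):
        h = (rng.standard_normal((n, m))
             + 1j * rng.standard_normal((n, m))) / np.sqrt(2)
        w = h.conj().T @ h                 # m x m complex Wishart matrix
        eig = np.linalg.eigvalsh(w)        # real eigenvalues, ascending
        scn[t] = eig[-1] / eig[0]
        dcn[t] = eig.sum() / eig[0]
    return scn, dcn

# Empirical tail P(SCN > x) for a dual (m = 2) Wishart matrix:
scn, dcn = wishart_condition_numbers(n=4, m=2, trials=20_000)
print("P(SCN > 10) ~", (scn > 10).mean())
print("median DCN  ~", np.median(dcn))
```

Sampled distributions of this kind are a convenient sanity check against exact expressions such as the polynomial results the paper derives for the dual case.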
Abstract:
The global ETF industry provides investment vehicles that are considerably more complicated than low-cost index trackers. In particular, we find that the real investments of ETFs that do not fully replicate their benchmarks may deviate from those benchmarks to leverage informational advantages (which leads to a surprising stock-selection ability), to benefit from the securities lending market, to support ETF-affiliated banks’ stock prices, and to help affiliated OEFs through cross-trading. These effects are more prevalent in ETFs domiciled in Europe. Market awareness of such additional risk is reflected in ETF outflows. These results have important normative implications for consumer protection and financial stability.
Abstract:
The term periphery is, most of the time, used with or in relation to the centre. The ‘zoning’ taken for granted by the architect is unlike the one portrayed by the director, who frames the differences between places in a non-linear manner. Film offers a constructed urban experience, suggesting the city to be a local network composed of nodes and links, rather than a centre and a margin. It is possible to talk about the construction of a new kind of network in film through a temporal representation of space, of the distant as the close. In this way, film may be a tool to shift the gaze from the bird’s-eye view to eye level and create ‘a unified perceptual image of the city’, in Christine Boyer’s words. Neither in fiction nor in reality is the experienced surface of the city two-dimensional. In this chapter, the nodes of Dublin are examined through two Irish films, Goldfish Memory (Elizabeth Gill, 2003) and Adam and Paul (Leonard Abrahamson, 2004). Specific elements of the city of Dublin, including walls, houses, pubs, streets, bridges, and parks, are analysed to understand the nature of the network of the city composed of nodes and their connections.
Abstract:
Fixed and wireless networks are increasingly converging towards common connectivity with IP-based core networks. Providing effective end-to-end resource and QoS management in such complex heterogeneous converged network scenarios requires unified, adaptive and scalable solutions to integrate and co-ordinate diverse QoS mechanisms of different access technologies with IP-based QoS. Policy-Based Network Management (PBNM) is one approach that could be employed to address this challenge. Hence, a policy-based framework for end-to-end QoS management in converged networks, CNQF (Converged Networks QoS Management Framework), has been proposed within our project. In this paper, we discuss the CNQF architecture, a Java implementation of its prototype and the experimental validation of its key elements. We then present a fuzzy-based CNQF resource management approach and study the performance of our implementation with real traffic flows on an experimental testbed. The results demonstrate the efficacy of our resource-adaptive approach for practical PBNM systems.
Abstract:
Policy-based network management (PBNM) paradigms provide an effective tool for end-to-end resource management in converged next-generation networks by enabling unified, adaptive and scalable solutions that integrate and co-ordinate diverse resource management mechanisms associated with heterogeneous access technologies. In our project, a PBNM framework for end-to-end QoS management in converged networks is being developed. The framework consists of distributed functional entities managed within a policy-based infrastructure to provide QoS and resource management in converged networks. Within any QoS control framework, an effective admission control scheme is essential for maintaining the QoS of flows present in the network. Measurement-based admission control (MBAC) and parameter-based admission control (PBAC) are two commonly used approaches. This paper presents the implementation and analysis of various measurement-based admission control schemes developed within a Java-based prototype of our policy-based framework. The evaluation is made with real traffic flows on a Linux-based experimental testbed where the current prototype is deployed. Our results show that, unlike classic MBAC-only or PBAC-only schemes, a hybrid approach that combines both methods can simultaneously improve admission control and network utilization efficiency.
Abstract:
This paper presents the design and implementation of a measurement-based QoS and resource management framework, CNQF (Converged Networks’ QoS Management Framework). CNQF is designed to provide unified, scalable QoS control and resource management through the use of a policy-based network management paradigm. It achieves this via distributed functional entities that are deployed to co-ordinate the resources of the transport network through centralized policy-driven decisions supported by a measurement-based control architecture. We present the CNQF architecture, the implementation of the prototype and the validation of various inbuilt QoS control mechanisms using real traffic flows on a Linux-based experimental testbed.
Abstract:
Software is patentable in Europe so long as there is sufficient ‘technical contribution’ under the decades-long interpretation of the European Patent Convention made by the Boards of Appeal of the European Patent Office. Despite the failure of the proposed Directive on Computer Implemented Inventions, opponents of software patents have failed to have any effect upon this technical contrivance. Yet, while national courts find the Boards of Appeal decisions persuasive, ‘technical contribution’ remains a difficult test for these various courts to apply. In this article I argue that the test is difficult to utilise in national litigation (it is an engineering approach, rather than a legal one) and suggest that, as the Boards of Appeal become less important (and thus less persuasive) should the proposed Unified Patent Court come to fruition, the ‘technical contribution’ test is unlikely to last. This may again make the whole issue of what/whether/how software should be patentable open to debate, hopefully in a less aggressive environment than has existed to date.
Abstract:
In finite difference time domain simulation of room acoustics, source functions are subject to various constraints. These depend on the way sources are injected into the grid and on the chosen parameters of the numerical scheme being used. This paper addresses the issue of selecting and designing sources for finite difference simulation, by first reviewing associated aims and constraints, and evaluating existing source models against these criteria. The process of exciting a model is generalized by introducing a system of three cascaded filters, respectively, characterizing the driving pulse, the source mechanics, and the injection of the resulting source function into the grid. It is shown that hard, soft, and transparent sources can be seen as special cases within this unified approach. Starting from the mechanics of a small pulsating sphere, a parametric source model is formulated by specifying suitable filters. This physically constrained source model is numerically consistent, does not scatter incoming waves, and is free from zero- and low-frequency artifacts. Simulation results are employed for comparison with existing source formulations in terms of meeting the spectral and temporal requirements on the outward propagating wave.
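To make the distinction between injection types concrete, here is a minimal one-dimensional sketch (an assumed toy setup, not the paper's room-acoustics model or its cascaded-filter formulation): a hard source overwrites the grid value at the source node and therefore scatters returning waves, whereas a soft source adds the source function into the update and is non-scattering.

```python
# 1D leapfrog FDTD toy model comparing hard vs soft source injection.
# Parameters are illustrative; boundaries are simple fixed (reflecting) ends.
import numpy as np

N, steps, courant = 200, 300, 1.0            # grid cells, time steps, c*dt/dx
src, rcv = N // 2, N // 2 + 40               # source and receiver nodes
t = np.arange(steps)
pulse = np.exp(-0.5 * ((t - 30) / 8.0) ** 2) # Gaussian driving pulse

def run(source_type):
    p_prev, p = np.zeros(N), np.zeros(N)
    out = np.empty(steps)
    for n in range(steps):
        lap = np.zeros(N)
        lap[1:-1] = p[2:] - 2 * p[1:-1] + p[:-2]
        p_next = 2 * p - p_prev + courant**2 * lap
        if source_type == "soft":
            p_next[src] += pulse[n]          # additive: non-scattering
        else:
            p_next[src] = pulse[n]           # hard: node clamped to the pulse
        p_prev, p = p, p_next
        out[n] = p[rcv]
    return out

soft, hard = run("soft"), run("hard")
print("receiver peak  soft: %.3f  hard: %.3f" % (soft.max(), hard.max()))
```

In this toy model the difference shows up once reflections from the domain ends return to the source node: the hard source re-scatters them, while the soft source leaves them untouched.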
Abstract:
The time period bridging the years 2007 to 2012 will be remembered as one characterised by dramatic changes in the Irish and UK construction industries. Construction companies witnessed unprecedented changes in the environment, namely the coincidence of a sharp economic downturn, the significant decline of public works, a reduction in lending, increased competition, and structural changes in the marketplace. Nevertheless, little has been documented on what response strategies construction companies adopt as a result of an economic recession. Based on four exploratory case studies, a taxonomy framework of the response strategies adopted by Irish and UK construction companies during the 2007 economic recession was developed relative to Porter’s (1980) generic strategies of cost leadership, differentiation, and focus. Porter’s (1980) model is a well-known theoretical framework among business strategists and industrial economists worldwide. The analysis provides strong support for the adoption of cost leadership strategies as a means of surviving the 2007 economic recession. The case studies further suggest that cost control initiatives are among the most important attributes of companies’ responses to the 2007 recession. The findings provide valuable assistance for construction contractors in developing effective strategies and thus reducing business failures during recessionary periods.
Abstract:
A reduced-density-operator description is developed for coherent optical phenomena in many-electron atomic systems, utilizing a Liouville-space, multiple-mode Floquet–Fourier representation. The Liouville-space formulation provides a natural generalization of the ordinary Hilbert-space (Hamiltonian) R-matrix-Floquet method, which has been developed for multi-photon transitions and laser-assisted electron–atom collision processes. In these applications, the R-matrix-Floquet method has been demonstrated to be capable of providing an accurate representation of the complex, multi-level structure of many-electron atomic systems in bound, continuum, and autoionizing states. The ordinary Hilbert-space (Hamiltonian) formulation of the R-matrix-Floquet method has been implemented in highly developed computer programs, which can provide a non-perturbative treatment of the interaction of a classical, multiple-mode electromagnetic field with a quantum system. This quantum system may correspond to a many-electron, bound atomic system and a single continuum electron. However, including pseudo-states in the expansion of the many-electron atomic wave function can provide a representation of multiple continuum electrons. The 'dressed' many-electron atomic states thereby obtained can be used in a realistic non-perturbative evaluation of the transition probabilities for an extensive class of atomic collision and radiation processes in the presence of intense electromagnetic fields. In order to incorporate environmental relaxation and decoherence phenomena, we propose to utilize the ordinary Hilbert-space (Hamiltonian) R-matrix-Floquet method as a starting-point for a Liouville-space (reduced-density-operator) formulation. To illustrate how the Liouville-space R-matrix-Floquet formulation can be implemented for coherent atomic radiative processes, we discuss applications to electromagnetically induced transparency, as well as to related pump–probe optical phenomena, and also to the unified description of radiative and dielectronic recombination in electron–ion beam interactions and high-temperature plasmas.
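For orientation, the generic reduced-density-operator equation of motion from which such a Liouville-space formulation proceeds can be written as below; this is a standard form with a Lindblad-type relaxation superoperator, given as an assumed sketch rather than the authors' specific R-matrix-Floquet equations.

```latex
% Generic Liouville-space equation of motion for the reduced density
% operator: Hamiltonian evolution plus relaxation/decoherence (Lindblad form).
\begin{equation}
  \frac{\partial \rho}{\partial t}
  = -\frac{i}{\hbar}\,[H(t),\rho]
    + \sum_{k} \gamma_{k}\Bigl( C_{k}\,\rho\,C_{k}^{\dagger}
      - \tfrac{1}{2}\bigl\{ C_{k}^{\dagger}C_{k},\,\rho \bigr\} \Bigr)
\end{equation}
```

In the multiple-mode Floquet–Fourier representation, ρ and the periodic H(t) are expanded in harmonics of the applied field frequencies, which converts this time-dependent equation into a time-independent eigenvalue problem in Liouville space.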
Abstract:
We introduce a general scheme for sequential one-way quantum computation in which static systems with long-lived quantum coherence (memories) interact with moving systems that may possess very short coherence times. Both the generation of the cluster state needed for the computation and its consumption by measurements are carried out simultaneously. As a consequence, effective clusters of one spatial dimension fewer than in the standard approach are sufficient for computation. In particular, universal computation requires only a one-dimensional array of memories. The scheme applies to discrete-variable systems of any dimension as well as to continuous-variable ones, and both are treated equivalently in the light of local complementation of graphs. In this way our formalism introduces a general framework that encompasses and generalizes in a unified manner some previous system-dependent proposals. The procedure is intrinsically well suited for implementations with atom-photon interfaces.
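Local complementation, the graph operation invoked above, complements the subgraph induced by the neighbourhood of a chosen vertex while leaving the rest of the graph untouched. A minimal sketch with an adjacency-matrix representation (illustrative only):

```python
# Local complementation at vertex v: toggle every edge between neighbours of v.
import numpy as np

def local_complement(adj: np.ndarray, v: int) -> np.ndarray:
    """Return the adjacency matrix after local complementation at vertex v."""
    a = adj.copy()
    nbrs = np.flatnonzero(adj[v])            # neighbourhood of v
    for i in nbrs:
        for j in nbrs:
            if i < j:
                a[i, j] ^= 1                 # toggle edge between neighbours
                a[j, i] = a[i, j]
    return a

# Example: path 0-1-2-3; complementing at vertex 1 toggles the edge (0, 2).
path = np.zeros((4, 4), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 3)]:
    path[i, j] = path[j, i] = 1
print(local_complement(path, 1))
```

For graph states, local complementation corresponds to local Clifford operations, which is what allows discrete- and continuous-variable cluster schemes to be treated on the same footing.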
Abstract:
This research investigates the relationship between elevated trace elements in soils, stream sediments and stream water and the prevalence of Chronic Kidney Disease (CKD). The study combines datasets from the UK Renal Registry (UKRR) report on patients with renal diseases requiring treatment, including Renal Replacement Therapy (RRT); the soil geochemical dataset for Northern Ireland provided by the Tellus Survey, Geological Survey of Northern Ireland (GSNI); and measurements of the bioaccessibility of Potentially Toxic Elements (PTEs) in soil samples obtained using the Unified BARGE Method (UBM). The motivation derives from the UKRR report, which highlights regional variation in the incidence rates of renally impaired patients, with many cases of unknown aetiology. Studies suggest that a potential cause of the large variation and uncertain aetiology is underlying environmental factors, such as the oral bioaccessibility of trace elements in the gastrointestinal tract.
As previous research indicates that long-term exposure is related to environmental factors, Northern Ireland is ideally placed for this research, as people traditionally live in the same location for long periods of time. Exploratory data analysis and multivariate analyses are used to examine the soil, stream sediment and stream water geochemistry data for a range of key elements, including arsenic, lead, cadmium and mercury, identified from a review of previous renal disease literature. The spatial prevalence of patients with long-term CKD is analysed on an area basis. Further work includes cluster analysis to detect areas of low or high incidence of CKD that are significantly correlated in space, and Geographically Weighted Regression (GWR) and Poisson kriging to examine the locally varying relationship between elevated concentrations of PTEs and the prevalence of CKD.
Abstract:
Lead (Pb) is a non-threshold toxin capable of inducing toxic effects at any blood level, but the availability of soil screening criteria for assessing the associated health risks is limited. The oral bioaccessibility of Pb in 163 soil samples was attributed to sources through solubility estimation and domain identification. Samples were extracted following the Unified BARGE Method (UBM). Urban, mineralisation, peat and granite domains accounted for elevated Pb concentrations compared to rural samples. High Pb solubility explained moderate-to-high gastric (G) bioaccessible fractions throughout the study area. Higher maximum G concentrations were measured in the urban (97.6 mg kg⁻¹) and mineralisation (199.8 mg kg⁻¹) domains. Higher average G concentrations occurred in the mineralisation (36.4 mg kg⁻¹) and granite (36.0 mg kg⁻¹) domains. The findings suggest that diffuse anthropogenic and widespread geogenic contamination could be capable of presenting health risks, which has implications for land management decisions in jurisdictions where guidance advises that these forms of pollution should not be regarded as contaminated land.
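For reference, the gastric bioaccessible fraction reported in UBM studies is conventionally defined as below (a standard convention, assumed here rather than quoted from this paper):

```latex
% Gastric bioaccessible fraction (standard UBM convention, assumed):
\begin{equation}
  \mathrm{BAF_{G}}\,(\%) = \frac{C_{\mathrm{gastric}}}{C_{\mathrm{total}}} \times 100
\end{equation}
```

where C_gastric is the Pb concentration extracted in the UBM gastric phase and C_total is the total soil Pb concentration.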
Abstract:
In this paper, we present a unified approach to the energy-efficient, variation-tolerant design of the Discrete Wavelet Transform (DWT) in the context of image processing applications. It is to be noted that most image processing applications do not require exactly correct numerical outputs. We exploit this important feature and propose a design methodology for the DWT which exposes energy-quality tradeoffs at each level of the design hierarchy, from the algorithm level down to the architecture and circuit levels, by taking advantage of the limited perceptual ability of the Human Visual System. A unique feature of this design methodology is that it guarantees robustness under process variability and facilitates aggressive voltage over-scaling. Simulation results show significant energy savings (74%-83%) with minor degradations in output image quality, while averting the catastrophic failures that process variations cause in a conventional design.
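The underlying principle can be illustrated in software alone (a hedged sketch with assumed parameters, not the paper's architecture- or circuit-level methodology): when computation errors induced by aggressive voltage over-scaling are steered into the high-frequency DWT subbands, image quality degrades gracefully rather than catastrophically.

```python
# One-level 2D Haar DWT with errors injected only into the detail (high-
# frequency) subbands, emulating an approximate datapath. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def haar2d(x):
    """One-level 2D Haar DWT; returns (LL, LH, HL, HH) subbands."""
    a = (x[0::2] + x[1::2]) / 2.0                    # row averages
    d = (x[0::2] - x[1::2]) / 2.0                    # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Inverse of haar2d (perfect reconstruction)."""
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    x = np.empty((2 * ll.shape[0], a.shape[1]))
    x[0::2], x[1::2] = a + d, a - d
    return x

img = rng.uniform(0, 255, (64, 64))                  # stand-in "image"
ll, lh, hl, hh = haar2d(img)
# Corrupt only the detail subbands; the LL band is kept exact, mimicking a
# quality-aware allocation of computation errors.
lh, hl, hh = (s + rng.normal(0, 2.0, s.shape) for s in (lh, hl, hh))
rec = ihaar2d(ll, lh, hl, hh)
mse = np.mean((img - rec) ** 2)
print("PSNR with detail-band errors: %.1f dB" % (10 * np.log10(255**2 / mse)))
```

Raising the injected error magnitude (emulating deeper voltage scaling) trades PSNR for energy gradually rather than catastrophically, which is the behaviour the design methodology above seeks to guarantee under process variations.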