808 results for "Data portal performance"


Relevance: 30.00%

Abstract:

In this paper we evaluate the performance of our earlier proposed enhanced relay-enabled distributed coordination function (ErDCF) for wireless ad hoc networks. The idea of ErDCF is to use high data rate nodes as relays for low data rate nodes. ErDCF achieves higher throughput and reduced energy consumption compared to the IEEE 802.11 distributed coordination function (DCF). This is a result of: 1) using a relay, which increases throughput and lowers the overall blocking time of nodes thanks to faster dual-hop transmission; 2) using a dynamic preamble (i.e. a short preamble for the relay transmission), which further increases throughput and lowers overall blocking time; and 3) reducing unnecessary overhearing by nodes not involved in the transmission. We evaluate the throughput and energy performance of ErDCF with different rate combinations. ErDCF (11,11) (i.e. R1 = R2 = 11 Mbps) yields a throughput improvement of 92.9% (at a packet length of 1000 bytes) and an energy saving of 72.2% at 50 nodes.
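The dual-hop gain behind ErDCF can be illustrated with simple airtime arithmetic. The sketch below is not the paper's simulation model: the rates and packet size are illustrative, and preamble/ACK overheads are ignored.

```python
# Illustrative sketch (not the paper's simulation model): airtime to send
# one packet directly at a low rate vs. via a relay in two fast hops.
# MAC overheads (preambles, ACKs, backoff) are ignored for clarity.

def tx_time_us(packet_bytes, rate_mbps):
    """Airtime in microseconds to send packet_bytes at rate_mbps."""
    return packet_bytes * 8 / rate_mbps  # bits / (Mbit/s) = microseconds

packet = 1000  # bytes, matching the abstract's example packet length

direct = tx_time_us(packet, 2)                             # slow 2 Mbps direct link
relayed = tx_time_us(packet, 11) + tx_time_us(packet, 11)  # two 11 Mbps hops

# Even though the relay path transmits the packet twice, it occupies the
# channel for far less time -- the source of the throughput gain and of
# the reduced blocking time for other nodes.
print(f"direct: {direct:.0f} us, relayed: {relayed:.0f} us")
```

With these numbers the direct link holds the channel for 4000 µs while the relayed path needs about 1455 µs in total.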

Relevance: 30.00%

Abstract:

The iRODS system, created by the San Diego Supercomputing Centre, is a rule-oriented data management system that allows the user to create sets of rules defining how data are to be managed. Each rule corresponds to a particular action or operation (such as checksumming a file), and the system is flexible enough to allow the user to create new rules for new types of operation. The iRODS system can interface to any storage system (provided an iRODS driver is built for that system) and relies on its metadata catalogue to provide a virtual file system that can handle files of any size and type. However, some storage systems (such as tape systems) do not handle small files efficiently and prefer small files to be packaged up (or "bundled") into larger units. We have developed a system that can bundle small data files of any type into larger units - mounted collections. The system can create collection families and maintains its own extensible metadata, including metadata on which family a collection belongs to. The mounted collection system can work standalone and is being incorporated into the iRODS system to enhance the system's flexibility in handling small files. In this paper we describe the motivation for creating a mounted collection system, its architecture and how it has been incorporated into the iRODS system. We describe the different technologies used to create the mounted collection system and provide some performance numbers.
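The bundling idea can be sketched in a few lines: pack small files into one container and keep a separate index recording the members and the family the collection belongs to. This is an illustrative sketch only; the field names and tar-based container below are assumptions, not iRODS's actual mounted-collection format or schema.

```python
# Minimal sketch of bundling small files into a "mounted collection":
# one tar container plus a metadata index recording each member and the
# (hypothetical) family the collection belongs to. The metadata fields
# here are illustrative, not the iRODS schema.
import io
import json
import tarfile

def bundle(files, family):
    """files: dict of name -> bytes. Returns (tar_bytes, metadata)."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w") as tar:
        for name, data in files.items():
            info = tarfile.TarInfo(name=name)
            info.size = len(data)
            tar.addfile(info, io.BytesIO(data))  # write member from memory
    metadata = {"family": family, "members": sorted(files),
                "count": len(files)}
    return buf.getvalue(), metadata

tar_bytes, meta = bundle({"a.dat": b"x" * 100, "b.dat": b"y" * 50},
                         family="smallfiles-2024")
print(json.dumps(meta))
```

A tape system then stores one container object instead of many tiny files, while the index preserves per-file addressability.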

Relevance: 30.00%

Abstract:

This correspondence proposes a new algorithm for joint OFDM data detection and phase noise (PHN) cancellation for constant modulus modulations. We highlight that it is important to address the overfitting problem, since this is a major detrimental factor impairing the joint detection process. To attack the overfitting problem we propose an iterative approach based on the minimum mean square prediction error (MMSPE), subject to the constraint that the estimated data symbols have constant power. The proposed constrained MMSPE algorithm (C-MMSPE) significantly improves the performance of existing approaches with little extra complexity. Simulation results are given to verify the proposed algorithm.
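The constant-power constraint at the heart of C-MMSPE can be illustrated by the projection step it implies: soft symbol estimates that drift off the constant-modulus circle are pulled back onto it. This is a hedged sketch of that single constraint, not the full iterative algorithm from the correspondence.

```python
# Sketch of the constant-modulus constraint used in C-MMSPE-style joint
# detection: each soft symbol estimate is projected back onto the circle
# |s|^2 = power, keeping its phase. This curbs overfitting to phase noise.
# Illustrative only -- not the paper's full MMSPE iteration.

def project_constant_modulus(symbols, power=1.0):
    """Scale each symbol to magnitude sqrt(power), preserving its phase."""
    r = power ** 0.5
    return [r * s / abs(s) if s != 0 else complex(r, 0) for s in symbols]

# Noisy QPSK-like estimates drift off the unit circle...
noisy = [0.8 + 0.7j, -1.2 + 0.1j, 0.1 - 0.9j]
projected = project_constant_modulus(noisy)

# ...but after projection every symbol has unit power again.
print([round(abs(s), 6) for s in projected])  # -> [1.0, 1.0, 1.0]
```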

Relevance: 30.00%

Abstract:

Many kernel classifier construction algorithms adopt classification accuracy as the performance metric in model evaluation. Moreover, equal weighting is often applied to each data sample in parameter estimation. These modelling practices often become problematic if the data sets are imbalanced. We present a kernel classifier construction algorithm using orthogonal forward selection (OFS) to optimize model generalization for imbalanced two-class data sets. This kernel classifier identification algorithm is based on a new regularized orthogonal weighted least squares (ROWLS) estimator and a model selection criterion of maximal leave-one-out area under the curve (LOO-AUC) of the receiver operating characteristic (ROC). It is shown that, owing to the orthogonalization procedure, the LOO-AUC can be calculated via an analytic formula based on the new ROWLS parameter estimator, without actually splitting the estimation data set. The proposed algorithm achieves minimal computational expense via a set of forward recursive updating formulae when searching for model terms with maximal incremental LOO-AUC value. Numerical examples are used to demonstrate the efficacy of the algorithm.
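AUC is attractive for imbalanced data because it scores the ranking of positives against negatives rather than raw accuracy. The sketch below computes AUC via its Mann-Whitney form — the quantity the LOO-AUC criterion maximizes; the paper's analytic leave-one-out formula based on the ROWLS estimator is not reproduced here.

```python
# AUC via the Mann-Whitney statistic: the fraction of (positive, negative)
# score pairs that the classifier orders correctly, with ties counting 0.5.
# This is the quantity the LOO-AUC criterion maximizes; the analytic
# leave-one-out formula from the paper is not reproduced here.

def auc(pos_scores, neg_scores):
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Imbalanced toy set: 3 positives vs. 6 negatives. A model scoring 90%
# "accuracy" by predicting the majority class would tell us nothing here.
positives = [0.9, 0.8, 0.4]
negatives = [0.7, 0.5, 0.3, 0.2, 0.2, 0.1]
print(auc(positives, negatives))  # 16 of 18 pairs ranked correctly
```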

Relevance: 30.00%

Abstract:

The general packet radio service (GPRS) has been developed to allow packet data to be transported efficiently over an existing circuit-switched radio network, such as GSM. The main applications of GPRS are in transporting Internet protocol (IP) datagrams from web servers (for telemetry or for mobile Internet browsers). Four GPRS baseband coding schemes are defined to offer a trade-off between requested data rate and propagation channel conditions. However, data rates of the order of >100 kbit/s are only achievable if the simplest coding scheme (CS-4) is used, which offers little error detection and correction (EDC) (requiring excellent SNR), and if the receiver hardware is capable of full duplex, which is not currently available in the consumer market. A simple EDC scheme to improve the GPRS block error rate (BLER) performance is presented, particularly for CS-4, although gains in other coding schemes are also seen. Every GPRS radio block that is corrected by the EDC scheme does not need to be retransmitted, releasing bandwidth in the channel and improving the user's application data rate. As GPRS requires intensive processing in the baseband, a viable field programmable gate array (FPGA) solution is presented in this paper.
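The throughput benefit of correcting blocks instead of retransmitting them follows directly from the block error rate. The sketch below is back-of-envelope only: the raw rate and BLER figures are assumptions for illustration, not the paper's measurements.

```python
# Back-of-envelope sketch (not the paper's measurements): every radio
# block the EDC scheme repairs avoids one retransmission, so effective
# throughput scales with the residual block error rate.

def effective_rate(raw_kbps, bler):
    """Throughput when a fraction `bler` of blocks must be re-sent."""
    return raw_kbps * (1 - bler)

raw = 100.0   # kbit/s, illustrative figure near the ">100 kbit/s" CS-4 ceiling
before = effective_rate(raw, 0.20)  # assumed 20% BLER without correction
after = effective_rate(raw, 0.05)   # assumed 5% residual BLER with EDC

print(f"{before:.1f} -> {after:.1f} kbit/s")
```

Under these assumed error rates, correction recovers 15 kbit/s of application throughput that retransmissions would otherwise consume.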

Relevance: 30.00%

Abstract:

The General Packet Radio Service (GPRS) was developed to allow packet data to be transported efficiently over an existing circuit-switched radio network. The main applications for GPRS are in transporting IP datagrams from the user's mobile Internet browser to and from the Internet, or in telemetry equipment. A simple Error Detection and Correction (EDC) scheme to improve the GPRS Block Error Rate (BLER) performance is presented, particularly for coding scheme 4 (CS-4), although gains in other coding schemes are also seen. Every GPRS radio block that is corrected by the EDC scheme does not need to be retransmitted, releasing bandwidth in the channel and improving throughput and the user's application data rate. As GPRS requires intensive processing in the baseband, a viable hardware solution for a GPRS BLER co-processor is discussed; it has been implemented in a Field Programmable Gate Array (FPGA) and is presented in this paper.

Relevance: 30.00%

Abstract:

This paper investigates the extent to which clients were able to influence performance measurement appraisals during the downturn in commercial property markets that began in the UK during the second half of 2007. The sharp change in market sentiment produced speculation that different client categories were attempting to influence their appraisers in different ways. In particular, it was recognised that the requirement for open‐ended funds to meet redemptions gave them strong incentives to ensure that their asset values were marked down to market. Using data supplied by Investment Property Databank, we demonstrate that, indeed, unlisted open‐ended funds experienced sharper drops in capital values than other fund types in the last quarter of 2007, after the market turning point and at the time when redemptions were at their highest. These differences are statistically significant and cannot simply be explained by differences in portfolio composition. Client influence on appraisal forms one possible explanation of the results observed: the different pressures on fund managers resulting in different appraisal outcomes.

Relevance: 30.00%

Abstract:

Background: Autism spectrum disorder (ASD) was once considered to be highly associated with intellectual disability and to show a characteristic IQ profile, with strengths in performance over verbal abilities and a distinctive pattern of 'peaks' and 'troughs' at the subtest level. However, there are few data from epidemiological studies.

Method: Comprehensive clinical assessments were conducted with 156 children aged 10–14 years [mean (s.d.) = 11.7 (0.9)], seen as part of an epidemiological study (81 childhood autism, 75 other ASD). A sample weighting procedure enabled us to estimate characteristics of the total ASD population.

Results: Of the 75 children with ASD, 55% had an intellectual disability (IQ<70) but only 16% had moderate to severe intellectual disability (IQ<50); 28% had average intelligence (115>IQ>85) but only 3% were of above average intelligence (IQ>115). There was some evidence for a clinically significant Performance/Verbal IQ (PIQ/VIQ) discrepancy, but discrepant verbal versus performance skills were not associated with a particular pattern of symptoms, as has been reported previously. There was mixed evidence of a characteristic subtest profile: whereas some previously reported patterns were supported (e.g. poor Comprehension), others were not (e.g. no 'peak' in Block Design). Adaptive skills were significantly lower than IQ and were associated with severity of early social impairment and also with IQ.

Conclusions: In this epidemiological sample, ASD was less strongly associated with intellectual disability than traditionally held, and there was only limited evidence of a distinctive IQ profile. Adaptive outcome was significantly impaired even for those children of average intelligence.
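The sample weighting procedure mentioned in the Method can be illustrated in miniature: each assessed child carries a weight (for instance, the inverse of their probability of selection into the clinical sample), and weighted proportions then estimate the total population. The flags and weights below are invented for illustration; they are not the study's data.

```python
# Sketch of survey-style sample weighting: each assessed child carries a
# weight (e.g. inverse probability of selection), and weighted proportions
# estimate the full ASD population. All numbers here are hypothetical.

def weighted_proportion(flags, weights):
    """Weighted share of cases where the flag is True."""
    total = sum(weights)
    return sum(w for f, w in zip(flags, weights) if f) / total

# has_id: intellectual disability (IQ < 70) indicator per assessed child
has_id = [True, True, False, False, False]
weights = [3.0, 1.0, 1.0, 2.0, 1.0]  # hypothetical sampling weights

# Unweighted prevalence would be 2/5 = 40%; weighting shifts the estimate.
print(round(weighted_proportion(has_id, weights), 2))  # -> 0.5
```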

Relevance: 30.00%

Abstract:

The extraction of design data for the lowpass dielectric multilayer according to Tschebysheff performance is described. The extraction proceeds initially by analogy with electric-circuit design, and can then be given numerical refinement which is also described. Agreement with the Tschebysheff desideratum is satisfactory. The multilayers extracted by this procedure are of fractional thickness, symmetric with regard to their central layers.
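The equal-ripple (Tschebysheff) lowpass shape such a multilayer is designed to approximate can be written as |H|² = 1/(1 + ε²·Tₙ(x)²), with Tₙ the Chebyshev polynomial. The sketch below evaluates this target response only; it does not reproduce the paper's layer-extraction procedure, and the order and ripple values are illustrative.

```python
# Target equal-ripple (Tschebysheff) lowpass response that the dielectric
# multilayer approximates: |H|^2 = 1 / (1 + eps^2 * T_n(x)^2). This sketches
# the desideratum only, not the circuit-analogy extraction from the paper.
import math

def chebyshev_T(n, x):
    """T_n(x) via the three-term recurrence (valid for all real x)."""
    t0, t1 = 1.0, x
    if n == 0:
        return t0
    for _ in range(n - 1):
        t0, t1 = t1, 2 * x * t1 - t0
    return t1

def response(x, n=5, ripple_db=0.5):
    """Power response at normalised frequency x (illustrative n and ripple)."""
    eps2 = 10 ** (ripple_db / 10) - 1
    return 1.0 / (1.0 + eps2 * chebyshev_T(n, x) ** 2)

# Inside the passband (|x| <= 1) the response ripples between the
# equal-ripple floor and 1; beyond x = 1 it rolls off monotonically.
print(round(response(0.0), 3), round(response(2.0), 6))
```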

Relevance: 30.00%

Abstract:

The optimal and the zero-forcing beamformers are two commonly used algorithms in subspace-based blind beamforming. The optimal beamformer is regarded as the algorithm with the best output SINR, while the zero-forcing algorithm emphasizes co-channel interference cancellation. This paper compares the performance of these two algorithms under practical conditions: the effect of finite data length and the presence of angle estimation error. The investigation reveals that the zero-forcing algorithm can be more robust in practical environments than the optimal algorithm.
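The zero-forcing idea, and its sensitivity to angle estimation error, can be shown with a two-element array: the weight vector is chosen orthogonal to the interferer's steering vector and normalised to unit gain on the desired signal. This is an illustrative sketch under assumed geometry (half-wavelength spacing, one interferer); the paper's comparison also covers the optimal (max-SINR) beamformer and finite-data effects, which are not modelled here.

```python
# Two-element zero-forcing beamformer sketch: w is built orthogonal to the
# interferer's steering vector (w^H a_i = 0) and scaled so w^H a_d = 1.
# Illustrative geometry only; the optimal beamformer is not modelled.
import cmath
import math

def steering(theta_deg):
    """Steering vector of a 2-element half-wavelength-spaced array."""
    phase = -math.pi * math.sin(math.radians(theta_deg))
    return [1.0 + 0.0j, cmath.exp(1j * phase)]

def zero_forcing_weights(theta_desired, theta_interf):
    a_d = steering(theta_desired)
    a_i = steering(theta_interf)
    w = [a_i[1].conjugate(), -a_i[0].conjugate()]       # w^H a_i = 0 by construction
    gain = sum(wk.conjugate() * ak for wk, ak in zip(w, a_d))
    return [wk / gain.conjugate() for wk in w]          # enforce w^H a_d = 1

w = zero_forcing_weights(0.0, 40.0)

def resp(theta_deg):
    """Magnitude of the array response |w^H a(theta)|."""
    return abs(sum(wk.conjugate() * ak
                   for wk, ak in zip(w, steering(theta_deg))))

# Unit gain toward the desired user, a null on the interferer, and a
# residual leak if the interferer's angle was estimated 5 degrees off.
print(round(resp(0.0), 3), round(resp(40.0), 6), round(resp(45.0), 3))
```

The nonzero response at 45° illustrates why angle estimation error matters: the null sits at the estimated angle, not the true one.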

Relevance: 30.00%

Abstract:

The use of data reconciliation techniques can considerably reduce the inaccuracy of process data due to measurement errors. This in turn results in improved control system performance and process knowledge. Dynamic data reconciliation techniques are applied to a model-based predictive control scheme. It is shown through simulations on a chemical reactor system that the overall performance of the model-based predictive controller is enhanced considerably when data reconciliation is applied. The dynamic data reconciliation techniques used include a combined strategy for the simultaneous identification of outliers and systematic bias.
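A minimal steady-state version of data reconciliation can be written in closed form: adjust the measured flows as little as possible (weighted by measurement variance) so that a mass balance holds exactly. The sketch below handles a single linear constraint; the dynamic reconciliation and outlier/bias identification machinery from the paper is not reproduced, and all numbers are illustrative.

```python
# Minimal steady-state data reconciliation sketch: project measured flows
# onto the hyperplane balance . x = 0, weighting each adjustment by its
# measurement variance (weighted least squares, single linear constraint).
# The paper's dynamic/bias-detection machinery is not reproduced.

def reconcile(measured, variances, balance):
    """Closed-form WLS adjustment satisfying sum(balance[i] * x[i]) = 0."""
    residual = sum(a * m for a, m in zip(balance, measured))
    denom = sum(a * a * v for a, v in zip(balance, variances))
    lam = residual / denom
    return [m - v * a * lam for m, v, a in zip(measured, variances, balance)]

# Measured: 10.1 + 5.3 in, 14.9 out -- an imbalance of 0.5 units.
measured = [10.1, 5.3, 14.9]
variances = [0.04, 0.04, 0.16]      # the outlet meter is least trusted
balance = [1.0, 1.0, -1.0]          # constraint: f1 + f2 - f3 = 0

reconciled = reconcile(measured, variances, balance)
print([round(f, 3) for f in reconciled])
```

The reconciled flows close the balance exactly, with the largest adjustment absorbed by the least-trusted (highest-variance) meter — which is why a model-based predictive controller sees more consistent data downstream.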

Relevance: 30.00%

Abstract:

This paper investigates the extent to which clients were able to influence performance measurement appraisals during the downturn in commercial property markets that began in the UK during the second half of 2007. The sharp change in market sentiment produced speculation that different client categories were attempting to influence their appraisers in different ways. In particular, it was recognised that the requirement for open-ended funds to meet redemptions gave them strong incentives to ensure that their asset values were marked down to market. Using data supplied by Investment Property Databank, we demonstrate that, indeed, unlisted open-ended funds experienced sharper drops in capital values than other fund types in the second half of 2007, after the market turning point. These differences are statistically significant and cannot simply be explained by differences in portfolio composition. Client influence on appraisal forms one possible explanation of the results observed: the different pressures on fund managers resulting in different appraisal outcomes.

Relevance: 30.00%

Abstract:

The "case for real estate" in the mixed-asset portfolio is a topic of continuing interest to practitioners and academics. The argument is typically made by comparing efficient frontiers of portfolios with real estate to those that exclude real estate. However, most investors will have held inefficient portfolios. Thus, when analysing real estate's place in the mixed-asset portfolio it seems illogical to compare the difference in risk-adjusted performance between efficient portfolios, which few if any investors would have held. The approach adopted here, therefore, is to compare the risk-adjusted performance of a number of mixed-asset portfolios without real estate (which may or may not be efficient) with a very large number of mixed-asset portfolios that include real estate (which again may or may not be efficient), to see how often there is an increase in risk-adjusted performance, significant or otherwise, using appraisal-based and de-smoothed annual data from 1952-2003. So, to the question of how often the addition of private real estate increases risk-adjusted performance compared with mixed-asset portfolios without real estate, the answer is: almost all the time. However, significant increases are harder to find. Additionally, a significant increase in risk-adjusted performance can come either from reductions in portfolio risk or from increases in return, depending on the investor's initial portfolio structure. In other words, simply adding real estate to a mixed-asset portfolio is not enough to ensure significant increases in performance, as the results depend on the percentage added and on the proper reallocation of the initial portfolio mix in the expanded portfolio.
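The comparison logic — counting how often portfolios that include real estate beat the same investor's portfolio without it on a risk-adjusted basis — can be sketched as follows. Everything here is made up for illustration: the return series are random draws, the 60/40 starting mix and reallocation range are assumptions, and the paper's appraisal-based UK data and de-smoothing are not modelled.

```python
# Sketch of the paper's comparison logic: count how often a mixed-asset
# portfolio including real estate beats the same portfolio without it on
# a risk-adjusted (Sharpe-ratio) basis. Return series and weights are
# hypothetical; the paper's UK appraisal-based data are not used.
import random
import statistics

def sharpe(returns):
    """Mean return over return volatility (risk-free rate omitted)."""
    return statistics.mean(returns) / statistics.pstdev(returns)

def portfolio_returns(weights, asset_returns):
    return [sum(w * r for w, r in zip(weights, period))
            for period in asset_returns]

random.seed(7)
# Columns: equities, bonds, real estate (hypothetical annual returns).
periods = [(random.gauss(0.08, 0.15), random.gauss(0.05, 0.06),
            random.gauss(0.07, 0.08)) for _ in range(30)]

without = sharpe(portfolio_returns([0.6, 0.4, 0.0], periods))
wins, trials = 0, 200
for _ in range(trials):
    w_re = random.uniform(0.05, 0.3)   # slice reallocated to real estate
    scale = 1 - w_re                   # shrink the original 60/40 mix pro rata
    wins += sharpe(portfolio_returns([0.6 * scale, 0.4 * scale, w_re],
                                     periods)) > without

print(f"real estate improved the Sharpe ratio in {wins}/{trials} allocations")
```

As the abstract notes, counting improvements is the easy part; testing whether each improvement is *significant* requires the additional machinery the paper applies.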

Relevance: 30.00%

Abstract:

This paper uses data provided by three major real estate advisory firms to investigate the level and pattern of variation in the measurement of historic real estate rental values for the main European office centres. The paper assesses the extent to which the data-providing organizations agree on historic market performance in terms of returns, risk and timing, and examines the relationship between market maturity and agreement. The analysis suggests that, at the aggregate level and for many markets, there is substantial agreement on the direction, quantity and timing of market change. However, there is substantial variability in the level of agreement among cities. The paper also assesses whether the different data sets produce different explanatory models and market forecasts. It is concluded that, although disagreement on the direction of market change is high for many markets, the different data sets often produce similar explanatory models and predict similar relative performance.

Relevance: 30.00%

Abstract:

This paper examines the short- and long-term persistence of tax-exempt real estate funds in the UK using winner-loser contingency table methodology. The persistence tests are applied to a database of varying numbers of funds, from a low of 16 to a high of 27, using quarterly returns over the 12 years from 1990 Q1 to 2001 Q4. The overall conclusion is that real estate funds in the UK show little evidence of persistence in the short term (quarterly and semi-annual data) or over considerable lengths of time (bi-annual to six-yearly intervals). In contrast, the results are better for annual data, with evidence of significant performance persistence. Thus, at this stage, it seems that an annual evaluation period provides the best discrimination of the winner and loser phenomenon in the real estate market. This result differs from equity and bond studies, where the repeat-winner phenomenon appears stronger over shorter evaluation periods. These results require careful interpretation, however: first, when only small samples are used, significant adjustments must be made to correct for small-sample bias; and second, the conclusions are sensitive to the length of the evaluation period and the specific test used. Nonetheless, it seems that persistence in the performance of real estate funds in the UK does exist, at least for annual data, and it appears to be a guide to beating the pack in the long run. Furthermore, although the evidence of persistence in performance for the overall sample of funds is limited, we found that two funds were consistent winners over this period, whereas no fund could be said to be a consistent loser.
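Winner-loser contingency table methodology reduces to four counts: funds above the median in one period are "winners", and persistence is read from how often winners (and losers) repeat. Two standard statistics on those counts are sketched below; the table entries are invented, not the paper's UK fund data.

```python
# Winner-loser contingency table sketch: WW = repeat winners, WL = winner
# then loser, LW and LL likewise. Counts below are invented for
# illustration, not the paper's UK fund data.
import math

def cross_product_ratio(ww, wl, lw, ll):
    """CPR = (WW*LL)/(WL*LW); values above 1 suggest persistence."""
    return (ww * ll) / (wl * lw)

def malkiel_z(ww, wl):
    """Z-test of whether winners repeat more often than chance (p = 0.5)."""
    n = ww + wl
    return (ww - n * 0.5) / math.sqrt(n * 0.25)

ww, wl, lw, ll = 9, 4, 4, 9   # hypothetical annual-data table
print(round(cross_product_ratio(ww, wl, lw, ll), 2),
      round(malkiel_z(ww, wl), 2))  # -> 5.06 1.39
```

With so few funds per period (16 to 27 in the paper's database), Z-values like this fall short of conventional significance thresholds — one reason the abstract stresses small-sample corrections.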