947 results for "Link quality estimation"


Relevance: 30.00%

Abstract:

Fine-scale spatial genetic structure (SGS) in natural tree populations is largely a result of restricted pollen and seed dispersal. Understanding the link between limitations to dispersal in gene vectors and SGS is of key interest to biologists and the availability of highly variable molecular markers has facilitated fine-scale analysis of populations. However, estimation of SGS may depend strongly on the type of genetic marker and sampling strategy (of both loci and individuals). To explore sampling limits, we created a model population with simulated distributions of dominant and codominant alleles, resulting from natural regeneration with restricted gene flow. SGS estimates from subsamples (simulating collection and analysis with amplified fragment length polymorphism (AFLP) and microsatellite markers) were correlated with the 'real' estimate (from the full model population). For both marker types, sampling ranges were evident, with lower limits below which estimation was poorly correlated and upper limits above which sampling became inefficient. Lower limits (correlation of 0.9) were 100 individuals, 10 loci for microsatellites and 150 individuals, 100 loci for AFLPs. Upper limits were 200 individuals, five loci for microsatellites and 200 individuals, 100 loci for AFLPs. The limits indicated by simulation were compared with data sets from real species. Instances where sampling effort had been either insufficient or inefficient were identified. The model results should form practical boundaries for studies aiming to detect SGS. However, greater sample sizes will be required in cases where SGS is weaker than for our simulated population, for example, in species with effective pollen/seed dispersal mechanisms.
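The subsampling experiment described above can be sketched generically: simulate replicate populations with spatially structured genotypes, estimate SGS from the full data and from a subsample of individuals and loci, then correlate the two sets of estimates across replicates. The population model and the SGS statistic below (a simple correlation between pairwise spatial and genetic distance) are simplified stand-ins for the paper's simulation, not its actual methods:

```python
# Toy subsampling-fidelity experiment; everything here is illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(n=100, loci=20):
    """Individuals on a line; allele frequency drifts with position,
    which creates spatial genetic structure."""
    x = rng.uniform(0, 1, n)
    p = 0.2 + 0.6 * x[:, None]                      # per-individual allele freq
    genotypes = (rng.uniform(size=(n, loci)) < p).astype(float)
    return x, genotypes

def sgs_estimate(x, genotypes):
    """Correlation between pairwise spatial and genetic distance;
    positive values indicate spatial genetic structure."""
    n = len(x)
    iu = np.triu_indices(n, 1)
    spatial = np.abs(x[:, None] - x[None, :])[iu]
    genetic = np.abs(genotypes[:, None, :] - genotypes[None, :, :]).mean(axis=2)[iu]
    return np.corrcoef(spatial, genetic)[0, 1]

def subsample_fidelity(replicates=30, n_sub=40, loci_sub=8):
    """Correlate subsample-based SGS estimates with full-data estimates."""
    full, sub = [], []
    for _ in range(replicates):
        x, g = simulate_population()
        full.append(sgs_estimate(x, g))
        idx = rng.choice(len(x), n_sub, replace=False)
        loci = rng.choice(g.shape[1], loci_sub, replace=False)
        sub.append(sgs_estimate(x[idx], g[np.ix_(idx, loci)]))
    return np.corrcoef(full, sub)[0, 1]
```

Sweeping `n_sub` and `loci_sub` in such a setup is how lower and upper sampling limits of the kind reported above can be located.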

Relevance: 30.00%

Abstract:

The relationship between human resource management practices and organizational performance (including quality of care in health-care organizations) is an important topic in the organizational sciences, but little research has examined this relationship in hospital settings. Human resource (HR) directors from sixty-one acute hospitals in England (Hospital Trusts) completed questionnaires or interviews exploring HR practices and procedures. The interviews probed for information about the extensiveness and sophistication of appraisal for employees, the extent and sophistication of training for employees, and the percentage of staff working in teams. Data on patient mortality were also gathered. The findings revealed strong associations between HR practices and patient mortality generally. The extent and sophistication of appraisal in the hospitals was particularly strongly associated with mortality, but there were links too with the sophistication of training for staff and with the percentage of staff working in teams.

Relevance: 30.00%

Abstract:

A broad-based approach has been used to assess the impact of discharges to rivers from surface water sewers, with the primary objective of determining whether such discharges have a measurable impact on water quality. Three parameters, each reflecting the effects of intermittent pollution, were included in a field work programme of biological and chemical sampling and analysis covering 47 sewer outfall sites. These parameters were the numbers and types of benthic macroinvertebrates upstream and downstream of the outfalls, the concentrations of metals in sediments, and the concentrations of metals in algae upstream and downstream of the outfalls. Information on the sewered catchments was collected from Local Authorities and by observation at the time of sampling, and includes catchment areas, land uses, evidence of connection to the foul system, and receiving water quality classification. The methods used for site selection, sampling, laboratory analysis and data analysis are fully described, and the survey results are presented. Statistical and graphical analysis of the biological data, with the aid of BMWP scores, showed a small but persistent fall in water quality downstream of the studied outfalls. Further analysis incorporating the catchment information indicated that initial water quality, sewered catchment size, receiving stream size, and catchment land use were important factors in determining the impact. Finally, the survey results were used to produce guidelines for estimating the impacts of surface water sewer discharges from knowledge of the catchment characteristics, so that planning authorities can consider water quality when new drainage systems are designed.

Relevance: 30.00%

Abstract:

Increasingly, retailers have to focus on service marketing strategies and tactics to differentiate themselves from their competitors, and delivering high levels of service quality becomes crucial for long-term success. Since customers' perception of service quality depends very much on the interaction between customer and employee, this study analyzes the link between employee and customer satisfaction in more detail. Moreover, drawing on three different theories used in prior research, it investigates whether the level of customer contact determines the existence or the intensity of the employee–customer satisfaction link. Analysis of dyadic data from 53,645 customers and 1,659 employees across 99 outlets of a large German Do-It-Yourself (DIY) retailer shows that employee job satisfaction affects customer satisfaction even for employee groups that are not in direct interaction with customers, although effects appear slightly stronger for high-contact groups. Implications for research and management are discussed.

Relevance: 30.00%

Abstract:

This study compares interpreter-mediated face-to-face Magistrates Court hearings with those conducted through prison video link, in which interpreters are located in court and non-English-speaking defendants in prison. It seeks to examine the impact that the presence of video link has on court actors in terms of interaction and behaviour. The data comprise 11 audio-recordings of face-to-face hearings, 10 recordings of prison video link hearings, semi-structured interviews with 27 court actors, and ethnographic observation of hearings as viewed by defendants in Wormwood Scrubs prison in London. The over-arching theme is the pervasive influence of the ecology of the courtroom upon all court actors in interpreter-mediated hearings and thus on the communication process. Close analysis of the court transcripts shows that the relative proximity of court actors to one another can be a determinant of status, interpreting role, mode and volume. The very few legal protocols which apply to interpreter-mediated cases (acknowledging and ratifying the interpreter, for example) are often forgotten or dispensed with. Court interpreters lack proper training in the specific challenges of court interpreting, whether they are co-present with the defendant or not. Other court actors often misunderstand the interpreter's role. This has probably come about because courts have adjusted their perceptions of what they think interpreters are supposed to do based on their own experiences of working with them, and have gradually come to accept poor practice (the inability to perform simultaneous interpreting, for example) as the norm.
In video link courts, mismatches of sound and image caused by court clerks' failure to track current speakers, poor image and sound quality, and the fact that in pre- and post-court consultations defendants can see and hear interpreters but not their defence advocates add further layers of disadvantage and confusion to those already suffered by non-English-speaking defendants. These factors make it less likely that justice will be done.

Relevance: 30.00%

Abstract:

The purpose of this study is to investigate the impact of human resource (HR) practices on organizational performance through the mediating role of psychological contract (expressed by the influence of employer on employee promises fulfillment through employee attitudes). The study is based on a national sample of 78 organizations from the public and private services sector in Greece, including education, health, and banking, and on data obtained from 348 employees. The statistical method employed is structural equation modeling, via LISREL and bootstrapping estimation. The findings of the study suggest that employee incentives, performance appraisal, and employee promotion are three major HR practices that must be extensively employed. Furthermore, the study suggests that the organization must primarily keep its promises about a pleasant and safe working environment, respectful treatment, and feedback for performance, in order for employees to largely keep their own promises about showing loyalty to the organization, maintaining high levels of attendance, and upholding company reputation. Additionally, the study argues that the employee attitudes of motivation, satisfaction, and commitment constitute the nested epicenter mediating construct in both the HR practices–performance and employer–employee promise fulfillment relationships, resulting in superior organizational performance. © 2012 Wiley Periodicals, Inc.

Relevance: 30.00%

Abstract:

Along with other diseases that can affect binocular vision and reduce a subject's visual quality, Congenital Nystagmus (CN) is of peculiar interest. CN is an ocular-motor disorder characterized by involuntary, conjugated ocular oscillations and, although identified more than forty years ago, its pathogenesis is still under investigation. This kind of nystagmus is termed congenital (or infantile) since it may be present at birth or arise in the first months of life. The majority of CN patients show a considerable decrease in visual acuity: image fixation on the retina is disturbed by the continuous, mainly horizontal, oscillations of the nystagmus. However, the image of a given target can still be stable during short periods in which eye velocity slows down while the target image is placed on the fovea (called foveation intervals). To quantify the extent of nystagmus, eye movement recordings are routinely employed, allowing physicians to extract and analyze its main features, such as waveform shape, amplitude and frequency. Suitably processed eye movement recordings also allow the computation of "estimated visual acuity" predictors: analytical functions that estimate expected visual acuity from signal features such as foveation time and foveation position variability. Hence, it is fundamental to develop robust and accurate methods to measure both parameters in order to obtain reliable values from the predictors. In this chapter the current methods to record eye movements in subjects with congenital nystagmus are discussed and the present techniques to accurately compute foveation time and eye position are presented. This study aims to disclose new methodologies for the analysis of congenital nystagmus eye movements, in order to identify nystagmus cycles and to evaluate foveation time, reducing the influence of repositioning saccades and data noise on the critical parameters of the estimation functions.
Use of those functions extends the information acquired with typical visual acuity measurement (e.g., Landolt C test) and could be a support for treatment planning or therapy monitoring. © 2010 by Nova Science Publishers, Inc. All rights reserved.
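As a concrete illustration of the two critical parameters named above, the sketch below extracts foveation time and foveation position variability from a sampled eye position trace: samples where the eye is both slow and close to the target are counted as foveation. The thresholds, function names and units are illustrative assumptions, not the chapter's methods:

```python
# Illustrative foveation-parameter extraction; thresholds are assumed.
import numpy as np

def foveation_metrics(position_deg, fs, pos_thr=0.5, vel_thr=4.0):
    """Return (total foveation time in s, std of position during foveation).
    position_deg: horizontal eye position relative to target, in degrees;
    fs: sampling rate in Hz."""
    velocity = np.gradient(position_deg) * fs          # deg/s
    foveating = (np.abs(position_deg) < pos_thr) & (np.abs(velocity) < vel_thr)
    time_s = np.count_nonzero(foveating) / fs
    pos_sd = float(np.std(position_deg[foveating])) if foveating.any() else float("nan")
    return time_s, pos_sd
```

Feeding these two values into an estimated-visual-acuity predictor is then a matter of evaluating the chosen analytical function.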

Relevance: 30.00%

Abstract:

Long-term foetal surveillance is often recommended. Hence, fully non-invasive acoustic recording through the maternal abdomen represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded heart sound signal is heavily loaded with noise, so the determination of the foetal heart rate raises serious signal processing issues. In this paper, we present a new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings. Filtering is employed as a first step of the algorithm to reduce the background noise. A block for first heart sound enhancement is then used to further reduce other components of the foetal heart sound signal. A complex logic block, guided by a number of rules concerning foetal heart beat regularity, is proposed as a successive block for the detection of the most probable first heart sounds from several candidates. A final block is used for exact first heart sound timing and, in turn, foetal heart rate estimation. The filtering and enhancing blocks are implemented by means of different techniques, so that different processing paths are proposed. Furthermore, a reliability index is introduced to quantify the consistency of the estimated foetal heart rate and, based on statistical parameters, a software quality index is designed to indicate the most reliable analysis procedure (that is, the combination of processing path and first heart sound time marking that provides the lowest estimation errors). The algorithm's performance has been tested on phonocardiographic signals recorded in a local gynaecology private practice from a sample group of about 50 pregnant women. Phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by our algorithm and the one provided by the cardiotocographic device).
Our results show that the proposed algorithm, in particular some analysis procedures, provides reliable foetal heart rate signals, very close to the reference cardiotocographic recordings. © 2010 Elsevier Ltd. All rights reserved.
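The block structure of the abstract's pipeline (filtering, first heart sound enhancement, peak selection, timing) can be reduced to a minimal sketch. This is an illustrative simplification, not the authors' algorithm; the detrending window, threshold and refractory period are assumptions:

```python
# Minimal phonocardiogram -> heart rate sketch (illustrative only).
import numpy as np

def estimate_fhr(signal, fs, max_bpm=220):
    """Estimate foetal heart rate (bpm) from a phonocardiogram-like trace."""
    # crude filtering: remove slow drift with a moving average
    win = int(0.05 * fs)
    kernel = np.ones(win) / win
    detrended = signal - np.convolve(signal, kernel, mode="same")
    # energy envelope enhances the first heart sounds
    envelope = np.convolve(detrended ** 2, kernel, mode="same")
    # peak picking with a refractory period set by the max plausible rate
    refractory = int(fs * 60.0 / max_bpm)
    threshold = 0.5 * envelope.max()
    peaks, last = [], -refractory
    for i in range(1, len(envelope) - 1):
        if (envelope[i] >= threshold and envelope[i] >= envelope[i - 1]
                and envelope[i] > envelope[i + 1] and i - last >= refractory):
            peaks.append(i)
            last = i
    if len(peaks) < 2:
        return None
    intervals = np.diff(peaks) / fs          # seconds between first heart sounds
    return 60.0 / float(np.mean(intervals))
```

The paper's reliability and quality indices would sit on top of such a core, scoring the consistency of the detected beat intervals for each processing path.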

Relevance: 30.00%

Abstract:

Motivation: In any macromolecular polyprotic system - for example protein, DNA or RNA - the isoelectric point - commonly referred to as the pI - can be defined as the point of singularity in a titration curve, corresponding to the solution pH value at which the net overall surface charge - and thus the electrophoretic mobility - of the ampholyte sums to zero. Different modern analytical biochemistry and proteomics methods depend on the isoelectric point as a principal feature for protein and peptide characterization. Protein separation by isoelectric point is a critical part of 2-D gel electrophoresis, a key precursor of proteomics, where discrete spots can be digested in-gel and proteins subsequently identified by analytical mass spectrometry. Peptide fractionation according to pI is also widely used in current proteomics sample preparation procedures prior to LC-MS/MS analysis. Accurate theoretical prediction of pI would therefore expedite such analysis. While pI calculation is widely used, it remains largely untested, motivating our efforts to benchmark pI prediction methods. Results: Using data from the database PIP-DB and one publicly available dataset as our reference gold standard, we have undertaken the benchmarking of pI calculation methods. We find that methods vary in their accuracy and are highly sensitive to the choice of basis set. The machine-learning algorithms, especially the SVM-based algorithm, showed superior performance when studying peptide mixtures. In general, learning-based pI prediction methods (such as Cofactor, SVM and Branca) require a large training dataset and their resulting performance strongly depends on the quality of that data. In contrast to iterative methods, machine-learning algorithms have the advantage of being able to add new features to improve the accuracy of prediction.
Contact: yperez@ebi.ac.uk
Availability and Implementation: The software and data are freely available at https://github.com/ypriverol/pIR.
Supplementary information: Supplementary data are available at Bioinformatics online.
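The definition in the Motivation section, pI as the pH at which the net charge of the ampholyte sums to zero, translates directly into a root-finding problem. The sketch below solves it by bisection over a Henderson-Hasselbalch charge model; the pKa values are one illustrative basis set (the benchmark above shows results are sensitive to this choice), not those of any specific method from the paper:

```python
# pI as the zero of the net-charge curve, found by bisection.
# The pKa basis set below is illustrative, not a benchmarked one.
PKA_POS = {"nterm": 9.0, "K": 10.5, "R": 12.5, "H": 6.0}            # +1 when protonated
PKA_NEG = {"cterm": 3.6, "D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1}   # -1 when deprotonated

def net_charge(seq, ph):
    """Henderson-Hasselbalch net charge of a peptide at a given pH."""
    charge = 1.0 / (1.0 + 10 ** (ph - PKA_POS["nterm"]))
    charge -= 1.0 / (1.0 + 10 ** (PKA_NEG["cterm"] - ph))
    for aa in seq:
        if aa in PKA_POS:
            charge += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
        elif aa in PKA_NEG:
            charge -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
    return charge

def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
    """Bisection on pH: net charge decreases monotonically with pH."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if net_charge(seq, mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

Iterative methods of this kind differ essentially only in the pKa basis set, which is exactly the sensitivity the benchmark measures.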

Relevance: 30.00%

Abstract:

High-volume capacitance is required to buffer the power difference between the input and output ports in single-phase grid-connected photovoltaic inverters, which becomes an obstacle to high system efficiency and long device lifetime. Furthermore, total harmonic distortion becomes serious when the system runs at low power levels. In this study, a comprehensive analysis is introduced for the two-stage topology with consideration of active power, DC-link (DCL) voltage, ripple and capacitance. A comprehensive DCL voltage control strategy is proposed to minimise the DCL capacitance while maintaining normal system operation. Furthermore, the proposed control strategy is flexible enough to be integrated with pulse-skipping control, which significantly improves power quality at light power conditions. Since the proposed control strategy needs to vary the DCL voltage, an active protection scheme is also introduced to prevent any voltage violation across the DCL. The proposed control strategy is evaluated by both simulation and experiments, and the results confirm its effectiveness.
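The trade-off the study exploits can be made concrete: in a single-phase inverter the instantaneous power oscillates at twice the grid frequency, so the DCL capacitor must buffer an energy swing of P/ω, giving a minimum capacitance for a chosen voltage ripple. The back-of-the-envelope sketch below follows that standard derivation; the numeric values in the test are illustrative, not the study's:

```python
# Minimum DC-link capacitance from the double-line-frequency energy swing.
# Derivation: p(t) = P(1 - cos(2wt)); the oscillating stored energy has
# peak-to-peak swing P/w, and (1/2)C(Vmax^2 - Vmin^2) ~ C*Vdc*dVpp.
import math

def min_dcl_capacitance(power_w, f_grid_hz, v_dc, v_ripple_pp):
    """Minimum DCL capacitance (farads) for a peak-to-peak ripple
    v_ripple_pp at average power power_w: C = P / (w * Vdc * dVpp)."""
    omega = 2 * math.pi * f_grid_hz
    return power_w / (omega * v_dc * v_ripple_pp)
```

Because the required capacitance falls as the DCL voltage rises, a control strategy that varies the DCL voltage (as proposed above) can operate with a much smaller capacitor.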

Relevance: 30.00%

Abstract:

Over the last couple of years there has been an ongoing debate on how sales managers contribute to organizational value. Direct measures of the link between sales-marketing interface quality and company performance are compromised, as company performance is influenced by a plethora of other factors. We advocate that the use of sales information is the missing link between sales-marketing relationship quality and organizational outcomes. We propose and empirically test a model of how sales-marketing interface quality affects the managerial use of sales information, which in turn leads to enhanced organizational performance. We found that marketing managers rely on sales information if they think that their sales counterpart is trustworthy, and that integration between the sales and marketing functions contributes to a trust-based relationship.

Relevance: 30.00%

Abstract:

The contributions of this dissertation are the development of two new interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation; (2) a shift-invariant sub-decimation decomposition method to overcome the deficiency of the decimation process in estimating motion, which stems from the shift-variant property of the wavelet transform's decimation. The enormous volume of data generated by digital video creates an intense need for efficient compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the interpixel redundancies inside and between the video frames by applying motion estimation and motion compensation (ME/MC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reasonably, hierarchical motion estimation by coarse-to-fine resolution refinement using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability. Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It realizes possible intrablocks in the subbands for lower-entropy coding while keeping the low computational load of level-refined motion estimation, thus achieving both temporal compression quality and computational simplicity. Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals, giving more accurate motion estimation without discontinuous boundary distortions.
Although wavelet-transformed coefficients still contain spatial domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain, due to the shift-variant property of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains the motion consistency between the original frame and the decomposed subframes, thereby improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
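The coarse-to-fine schemes above all rest on basic block matching: for each block of the current frame, search a window in the reference frame for the displacement minimising a matching criterion (here the sum of absolute differences, SAD). The single-level full-search sketch below is the generic primitive; the dissertation's level-refined and subband variants build on it, and the function names here are illustrative:

```python
# Single-level full-search SAD block matching (illustrative primitive).
import numpy as np

def block_motion(ref, cur, block=8, search=4):
    """Return one (dy, dx) motion vector per block of `cur`, found by
    exhaustive SAD search over a +/-`search` window in `ref`."""
    h, w = cur.shape
    rh, rw = ref.shape
    vectors = []
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            target = cur[by:by + block, bx:bx + block]
            best, best_v = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > rh or x + block > rw:
                        continue  # candidate falls outside the reference
                    sad = np.abs(ref[y:y + block, x:x + block].astype(int)
                                 - target.astype(int)).sum()
                    if best is None or sad < best:
                        best, best_v = sad, (dy, dx)
            vectors.append(best_v)
    return vectors
```

A hierarchical scheme runs this matcher on a coarse (low-resolution) level first and refines the resulting vectors at finer levels, which is the cost saving the level-refined method keeps.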

Relevance: 30.00%

Abstract:

The purpose of this study was to design a preventive scheme using directional antennas to improve the performance of mobile ad hoc networks. In this dissertation, a novel Directionality based Preventive Link Maintenance (DPLM) scheme is proposed to characterize the performance gain [JaY06a, JaY06b, JCY06] obtained by extending the life of the link. In order to maintain the link and take preventive action, the signal strength of data packets is measured. Moreover, location information or angle-of-arrival information is collected during communication and saved in a table. When the measured signal strength falls below an orientation threshold, an orientation warning is generated towards the previous-hop node. Once the orientation warning is received by the previous-hop (adjacent) node, it verifies the correctness of the warning with a few hello pings, initiates a high-quality directional link (a link above the threshold) and immediately switches to it, avoiding a link break altogether. The location information is used to create a directional link by orienting neighboring nodes' antennas towards each other. We call this operation an orientation handoff, which is similar to soft handoff in cellular networks. Signal strength is the indicating factor that represents the health of the link and helps to predict link failure; in other words, link breakage happens due to node movement, which gradually reduces the signal strength of received packets. The DPLM scheme helps ad hoc networks avoid or postpone the costly operation of route rediscovery in on-demand routing protocols by taking the above-mentioned preventive action. This dissertation advocates close but simple collaboration between the routing, medium access control and physical layers. In order to extend the link, the Dynamic Source Routing (DSR) and IEEE 802.11 MAC protocols were modified to use the ability of directional antennas to transmit over longer distances.
A directional antenna module was implemented in the OPNET simulator with two separate modes of operation: omnidirectional and directional. The antenna module was incorporated in the wireless node model and simulations were performed to characterize the performance improvement of mobile ad hoc networks. Extensive simulations have shown that, without noticeably affecting the behavior of the routing protocol, aggregate throughput, packet delivery ratio, end-to-end delay (latency), routing overhead, number of data packets dropped, and number of path breaks are improved considerably. Analysis of the results in different scenarios shows that the use of directional antennas with the proposed DPLM scheme is promising for improving the performance of mobile ad hoc networks.
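The preventive trigger described above reduces to monitoring the signal strength of received data packets against an orientation threshold. The sketch below is an illustrative reduction of that logic; the EWMA smoothing, threshold value and class interface are assumptions, not the dissertation's design:

```python
# Illustrative per-link RSSI monitor for a DPLM-style orientation warning.
class LinkMonitor:
    """Tracks smoothed per-packet signal strength and flags when the
    link needs an orientation handoff before it actually breaks."""

    def __init__(self, orientation_threshold_dbm=-85.0, alpha=0.3):
        self.threshold = orientation_threshold_dbm
        self.alpha = alpha            # EWMA smoothing factor (assumed)
        self.rssi = None              # smoothed signal strength estimate

    def on_packet(self, rssi_dbm):
        """Update the estimate from a received data packet; return True
        if an orientation warning should be sent to the previous-hop node."""
        if self.rssi is None:
            self.rssi = rssi_dbm
        else:
            self.rssi = self.alpha * rssi_dbm + (1 - self.alpha) * self.rssi
        return self.rssi < self.threshold
```

On a warning, the previous-hop node would confirm with hello pings and orient both antennas to form the replacement directional link, as the scheme describes.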

Relevance: 30.00%

Abstract:

Crash reduction factors (CRFs) are used to estimate the potential number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for different functional classes of the Florida State Highway System. Crash data from years 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data were based on one-mile segments containing homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear, with crashes increasing with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the Likelihood Ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC).
The NBRM was found to be appropriate for only one category and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial distribution model generally provides a better fit for the data than the Poisson distribution model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, which was the case for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the crash prediction power of SPF models may be further improved.
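The EB correction described above blends the SPF prediction with the observed count at a site, with a weight governed by the negative binomial over-dispersion parameter phi (Var = mu + mu^2/phi). The sketch below shows this standard form of the estimate; the toy SPF coefficients are placeholders, not the dissertation's fitted values:

```python
# Empirical Bayes expected-crash estimate from an SPF prediction.
import math

def spf_prediction(aadt, length_mi, beta0=-5.0, beta1=0.6):
    """Toy SPF: crashes/year = exp(beta0) * AADT^beta1 * length.
    The coefficients here are placeholders, not fitted values."""
    return math.exp(beta0) * aadt ** beta1 * length_mi

def eb_expected(observed, mu, phi):
    """EB estimate = w*mu + (1-w)*observed, with w = 1/(1 + mu/phi).
    phi is the NB over-dispersion parameter (Var = mu + mu^2/phi)."""
    w = 1.0 / (1.0 + mu / phi)
    return w * mu + (1.0 - w) * observed
```

The EB estimate always lies between the SPF prediction and the observed count, which is exactly how it tempers the regression-to-the-mean bias of a naive before-and-after comparison.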

Relevance: 30.00%

Abstract:

This dissertation aimed to improve travel time estimation for the purpose of transportation planning by developing a travel time estimation method that incorporates the effects of signal timing plans, which are difficult to consider in planning models. For this purpose, an analytical model has been developed. The model parameters were calibrated based on data from CORSIM microscopic simulation, with signal timing plans optimized using the TRANSYT-7F software. The independent variables in the model are link length, free-flow speed, and traffic volumes from the competing turning movements. The developed model has three advantages over traditional link-based or node-based models. First, it considers the influence of signal timing plans for a variety of traffic volume combinations without requiring signal timing information as input. Second, it describes the non-uniform spatial distribution of delay along a link, making it possible to estimate the impacts of queues at different locations upstream of an intersection and to attribute delays to the subject link and the upstream link. Third, it shows promise of improving the accuracy of travel time prediction. The mean absolute percentage error (MAPE) of the model is 13% for a set of field data from the Minnesota Department of Transportation (MDOT); this is close to the MAPE of uniform delay in the HCM 2000 method (11%). The HCM is the industry-accepted analytical model in the existing literature, but it requires signal timing information as input for calculating delays. The developed model also outperforms the HCM 2000 method for a set of Miami-Dade County data representing congested traffic conditions, with a MAPE of 29%, compared to 31% for the HCM 2000 method. These advantages make the proposed model feasible for application to a large network without the burden of signal timing input, while improving the accuracy of travel time estimation.
An assignment model with the developed travel time estimation method has been implemented in a South Florida planning model, which improved assignment results.
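The accuracy comparisons above rely on the mean absolute percentage error; for reference, a minimal implementation of the metric as it is conventionally defined:

```python
# Mean absolute percentage error (MAPE), in percent.
def mape(observed, estimated):
    """MAPE over paired observed/estimated values; observed must be nonzero."""
    assert len(observed) == len(estimated) and all(o != 0 for o in observed)
    return 100.0 * sum(abs((o - e) / o)
                       for o, e in zip(observed, estimated)) / len(observed)
```

Comparing the developed model and the HCM 2000 method on the same field travel times amounts to comparing the two resulting MAPE values, as in the 29% versus 31% result quoted above.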