958 results for Data frequency


Relevance: 30.00%

Publisher:

Abstract:

In many of the Statnotes described in this series, the statistical tests assume that the data are a random sample from a normal distribution. These Statnotes include most of the familiar statistical tests, such as the ‘t’ test, analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). However, many variables exhibit a more or less ‘skewed’ distribution. A skewed distribution is asymmetrical, with the bulk of the observations displaced to the left (positive skew) or to the right (negative skew). If the mean of the distribution is low, the degree of variation is large, and values can only be positive, a positively skewed distribution is usually the result. Many variables potentially have a low mean and high variance, including the abundance of bacterial species on plants, the latent period of an infectious disease, and the sensitivity of certain fungi to fungicides. Such positively skewed distributions are often fitted successfully by a variant of the normal distribution called the log-normal distribution. This Statnote describes fitting the log-normal distribution with reference to two scenarios: (1) the frequency distribution of bacterial numbers isolated from cloths in a domestic environment and (2) the sizes of lichenised ‘areolae’ growing on the hypothallus of Rhizocarpon geographicum (L.) DC.
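
As a rough illustration of the idea (not code from the Statnote itself), the sketch below fits a log-normal distribution to a small set of invented bacterial counts by estimating the mean and standard deviation of the log-transformed values; the counts and the use of scipy are assumptions made for this example.

```python
import numpy as np
from scipy import stats

# Hypothetical bacterial counts (CFU per cloth); strongly right-skewed.
counts = np.array([120, 450, 80, 2300, 310, 95, 5400, 760, 180, 1250,
                   60, 340, 8900, 220, 510])

# Fit the log-normal by estimating the mean and SD of log(x); if the
# log-normal model is adequate, log(x) should be approximately normal.
log_counts = np.log(counts)
mu, sigma = log_counts.mean(), log_counts.std(ddof=1)

# Equivalent scipy parameterisation: shape = sigma, scale = exp(mu).
dist = stats.lognorm(s=sigma, scale=np.exp(mu))

# Check normality of the log-transformed data (Shapiro-Wilk test).
stat, p = stats.shapiro(log_counts)
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}, Shapiro-Wilk p = {p:.3f}")
print(f"fitted median = {dist.median():.0f}, fitted mean = {dist.mean():.0f}")
```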

Relevance: 30.00%

Publisher:

Abstract:

Returnable transport equipment (RTE) such as pallets forms an integral part of the supply chain, and poor management leads to costly losses. Companies often address this matter by outsourcing the management of RTE to logistics service providers (LSPs). LSPs are faced with the task of providing logistical expertise to reduce RTE-related waste, whilst differentiating their own services to remain competitive. In the current challenging economic climate, the role of the LSP in delivering innovative ways to achieve competitive advantage has never been so important. It is reported that applying radio frequency identification (RFID) to RTE enables LSPs such as DHL to gain competitive advantage and offer clients improvements such as loss reduction, process efficiency improvement and effective security. However, the increased visibility and functionality of RFID-enabled RTE requires further investigation with regard to decision-making. The distributed nature of the RTE network favours a decentralised decision-making format. Agents are an effective way to represent objects from the bottom up, capturing their behaviour and enabling localised decision-making. Therefore, an agent-based system is proposed to represent the RTE network and utilise the visibility and data gathered from RFID tags. Two types of agent are developed to represent the trucks and the RTE, each with bespoke rules and algorithms to facilitate negotiations. The aim is to create schedules that integrate RTE pick-ups as the trucks return to the depot. The findings assert that:
- agent-based modelling provides an autonomous tool that is effective in modelling RFID-enabled RTE in a decentralised manner, utilising the real-time data facility;
- the RFID-enabled RTE model developed enables autonomous agent interaction, which leads to a feasible schedule integrating both forward and reverse flows for each RTE batch;
- the RTE agent scheduling algorithm developed promotes the utilisation of RTE by including an automatic return flow for each batch of RTE, whilst considering fleet costs and utilisation rates;
- the research conducted contributes an agent-based platform which LSPs can use to assess the most appropriate strategies to implement for RTE network improvement for each of their clients.
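
The thesis's actual agent rules and negotiation algorithms are not given in the abstract; purely as a hypothetical sketch of the general pattern, the following shows truck agents bidding for RTE pick-up batches on their way back to the depot. All class names, rules and numbers are invented for illustration.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RTEBatch:
    """A batch of returnable transport equipment waiting for collection."""
    batch_id: str
    location: float   # position along a simplified 1-D route to the depot (km)
    quantity: int

@dataclass
class TruckAgent:
    """A truck returning to the depot that can bid for RTE pick-ups."""
    truck_id: str
    position: float          # current distance from the depot (km)
    spare_capacity: int
    schedule: list = field(default_factory=list)

    def bid(self, batch: RTEBatch) -> Optional[float]:
        """Return a bid (detour distance) or None if the batch does not fit."""
        if batch.quantity > self.spare_capacity:
            return None
        # Batches lying between the truck and the depot need no detour here.
        return abs(self.position - batch.location) if batch.location > self.position else 0.0

    def accept(self, batch: RTEBatch, cost: float) -> None:
        self.spare_capacity -= batch.quantity
        self.schedule.append((batch.batch_id, cost))

def negotiate(trucks, batches) -> None:
    """Assign each RTE batch to the truck offering the cheapest valid bid."""
    for batch in batches:
        valid = [(t.bid(batch), t) for t in trucks if t.bid(batch) is not None]
        if valid:
            cost, winner = min(valid, key=lambda pair: pair[0])
            winner.accept(batch, cost)

trucks = [TruckAgent("T1", position=40.0, spare_capacity=20),
          TruckAgent("T2", position=10.0, spare_capacity=5)]
batches = [RTEBatch("B1", location=25.0, quantity=8),
           RTEBatch("B2", location=5.0, quantity=4)]
negotiate(trucks, batches)
for t in trucks:
    print(t.truck_id, t.schedule)
```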

Relevance: 30.00%

Publisher:

Abstract:

We present a newly designed polymer light-emitting diode with a bandwidth of ∼350 kHz for high-speed visible light communications. Using this new polymer light-emitting diode as a transmitter, we have achieved a record transmission speed of 10 Mb/s for a polymer light-emitting diode-based optical communication system with an orthogonal frequency division multiplexing technique, matching the performance of single-carrier formats that use multi-tap equalization. To achieve such a high data rate, a power pre-emphasis technique was adopted. © 2014 Optical Society of America.
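
The transmitter design itself is not reproduced here; the sketch below only illustrates the power pre-emphasis idea in an OFDM context, boosting each subcarrier by the inverse of an assumed first-order LED response so that the response after the LED is roughly flat. The LED model, the subcarrier spacing, and the omission of the Hermitian symmetry and DC bias needed for a real intensity-modulated link are all simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sc = 64                                # subcarriers per OFDM symbol
f_3db = 350e3                            # assumed LED 3 dB bandwidth (Hz)
f_sc = np.arange(1, n_sc + 1) * 50e3     # hypothetical subcarrier frequencies (Hz)

# Model the LED as a first-order low-pass response and pre-emphasise each
# subcarrier by its inverse so the combined response is roughly flat.
led_response = 1.0 / np.sqrt(1.0 + (f_sc / f_3db) ** 2)
pre_emphasis = 1.0 / led_response

# Random QPSK symbols on each subcarrier.
bits = rng.integers(0, 2, size=(n_sc, 2))
qpsk = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

# Apply pre-emphasis in the frequency domain, then build the time-domain
# OFDM symbol with an inverse FFT and a cyclic prefix.
tx_freq = qpsk * pre_emphasis
tx_time = np.fft.ifft(tx_freq)
ofdm_symbol = np.concatenate([tx_time[-8:], tx_time])   # 8-sample cyclic prefix

print("pre-emphasis range: %.1f dB to %.1f dB"
      % (20 * np.log10(pre_emphasis.min()), 20 * np.log10(pre_emphasis.max())))
```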

Relevance: 30.00%

Publisher:

Abstract:

Background and Objective: To maximise the benefit from statin therapy, patients must maintain regular therapy indefinitely. Non-compliance is thought to be common in those taking medication at regular intervals over long periods of time, especially where they may perceive no immediate benefit (News editorial, 2002). This study extends previous work in which commonly held prescribing data are used as a surrogate marker of compliance, and was designed to examine compliance in those stabilised on statins in a large general practice. Design: Following ethical approval, details of all patients who had a single statin for 12 consecutive months with no changes in drug, frequency or dose, between December 1999 and March 2003, were obtained. Setting: An Eastern Birmingham Primary Care Trust GP surgery. Main Outcome Measures: A compliance ratio was calculated by dividing the number of days of treatment by the number of doses prescribed. For a once-daily regimen the ratio for full compliance = 1. Results: 324 patients were identified. The average compliance ratio for the first six months of the study was 1.06 ± 0.01 (range 0.46 – 2.13) and for the full twelve months was 1.05 ± 0.01 (range 0.58 – 2.08). Conclusions: The data shown here indicate that, as a group, long-term, stabilised statin users appear compliant. However, the range of values obtained shows that there are identifiable subsets of patients who are not taking their therapy as prescribed. Although the apparent use of more doses than prescribed in some patients may result from medication hoarding, this cannot be the case in the patients who apparently take less. It has been demonstrated here that the compliance ratio can be used as an early indicator of problems, allowing targeted compliance advice to be given where it will have the most benefit. References: News Editorial. Pharmacy records could be used to enhance statin compliance in elderly. Pharm. J. 2002; 269: 121.
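
The compliance ratio itself is simple arithmetic; a minimal sketch of the calculation with hypothetical numbers:

```python
def compliance_ratio(days_of_treatment: int, doses_prescribed: int) -> float:
    """Compliance ratio as defined above: days of treatment / doses prescribed.

    For a once-daily regimen, full compliance gives a ratio of 1. A ratio
    below 1 means more doses were prescribed than days of treatment elapsed;
    a ratio above 1 means fewer doses were prescribed than days elapsed.
    """
    return days_of_treatment / doses_prescribed

# Hypothetical patient: 365 days of once-daily treatment, 336 doses prescribed.
print(round(compliance_ratio(365, 336), 2))   # 1.09
```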

Relevance: 30.00%

Publisher:

Abstract:

Bio-impedance analysis (BIA) provides a rapid, non-invasive technique for body composition estimation. BIA offers a convenient alternative to standard techniques such as MRI, CT scan or DEXA scan for selected types of body composition analysis. The accuracy of BIA is limited because it is an indirect method of composition analysis: it relies on linear relationships between measured impedance and morphological parameters such as height and weight to derive estimates. To overcome these underlying limitations of BIA, a multi-frequency segmental bio-impedance device was constructed through a series of iterative enhancements and improvements of existing BIA instrumentation. Key features of the design included an easy-to-construct current source and a compact PCB design. The final device was trialled with 22 human volunteers, and the measured impedance was compared against body composition estimates obtained by DEXA scan. This enabled the development of newer techniques for making BIA predictions. To add a ‘visual aspect’ to BIA, volunteers were scanned in 3D using an inexpensive scattered-light gadget (Xbox Kinect controller), and 3D volumes of their limbs were compared with BIA measurements to further improve BIA predictions. A three-stage digital filtering scheme was also implemented to enable extraction of heart-rate data from recorded bio-electrical signals. Additionally, modifications were introduced to measure the change in bio-impedance with motion, which could be adapted to further improve the accuracy of limb composition analysis. The findings in this thesis aim to give new direction to the prediction of body composition using BIA. The design development and refinement applied to BIA in this research programme suggest new opportunities to enhance the accuracy and clinical utility of BIA for the prediction of body composition. In particular, the use of bio-impedance to predict limb volumes would provide an additional metric for body composition measurement and help distinguish between fat and muscle content.
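
The thesis's three-stage filter design is not specified in the abstract; the sketch below shows one plausible arrangement (high-pass, low-pass, then smoothing before peak detection) applied to a synthetic signal. The sampling rate, cut-off frequencies and signal content are all assumed for illustration.

```python
import numpy as np
from scipy import signal

fs = 100.0                        # assumed sampling rate of the recording (Hz)
t = np.arange(0, 30, 1 / fs)

# Synthetic test signal: respiratory drift plus a small ~1.2 Hz cardiac
# component and noise, standing in for a recorded bio-electrical signal.
rng = np.random.default_rng(1)
raw = (0.5 * np.sin(2 * np.pi * 0.25 * t)
       + 0.02 * np.sin(2 * np.pi * 1.2 * t)
       + 0.01 * rng.standard_normal(t.size))

# Stage 1: high-pass to remove baseline and respiratory drift.
b_hp, a_hp = signal.butter(2, 0.7 / (fs / 2), btype="highpass")
stage1 = signal.filtfilt(b_hp, a_hp, raw)

# Stage 2: low-pass to suppress high-frequency noise.
b_lp, a_lp = signal.butter(4, 3.0 / (fs / 2), btype="lowpass")
stage2 = signal.filtfilt(b_lp, a_lp, stage1)

# Stage 3: short moving-average smoothing before peak detection.
stage3 = np.convolve(stage2, np.ones(5) / 5, mode="same")

peaks, _ = signal.find_peaks(stage3, distance=int(0.4 * fs))
heart_rate_bpm = 60.0 * fs / np.mean(np.diff(peaks))
print(f"estimated heart rate: {heart_rate_bpm:.0f} bpm")
```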

Relevance: 30.00%

Publisher:

Abstract:

Fierce competition has developed within the third-party logistics (3PL) market as providers compete to win customers and enhance their competitive advantage through cost-reduction plans and service differentiation. 3PL providers are expected to develop advanced technological and logistical service applications that can support cost reduction while increasing service innovation. To enhance competitiveness, this paper proposes the implementation of radio-frequency identification (RFID) enabled returnable transport equipment (RTE) in combination with the consolidation of network assets and cross-docking. RFID-enabled RTE can significantly improve network visibility of all assets with continuous real-time data updates. A four-level cyclic model aiding 3PL providers to achieve competitive advantage has been developed. The focus is to reduce assets, increase asset utilisation, reduce RTE cycle time and introduce real-time data in the 3PL network. Furthermore, this paper highlights the need for further research from the 3PL perspective. Copyright © 2013 Inderscience Enterprises Ltd.

Relevance: 30.00%

Publisher:

Abstract:

The purpose of this investigation was to interpret bitumen-aggregate adhesion based on the dielectric spectroscopic response of the individual material components, using their dielectric constants, refractive indices and the average tangent of the dielectric loss angle (average loss tangent). Dielectric spectroscopy of bitumen binders at room temperature was performed in the frequency range 0.01–1000 Hz. Dielectric spectroscopy is an experimental method for characterizing the dielectric permittivity of a material as a function of frequency. Adhesion data were determined using the rolling bottle method. The results show that the magnitude of the average loss tangent depends on the bitumen type. The average loss tangent in the frequency range 0.01–1 Hz is introduced as a potential indicator for predicting the polarizability and, thereby, the adhesion potential of bitumen binders to quartz aggregates when using Portland cement. In order to obtain acceptable adhesion of 70/100 penetration grade bitumen binders to quartz aggregates when using Portland cement, it is suggested that the binder should have an average tan δ > 0.035 in the frequency range 0.01–1 Hz.
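
A minimal sketch of how the average loss tangent indicator might be computed from a measured permittivity spectrum is given below; the frequency grid and permittivity values are invented, and only the 0.01–1 Hz averaging window and the tan δ > 0.035 threshold come from the text above.

```python
import numpy as np

# Hypothetical dielectric spectroscopy output for one bitumen binder:
# frequency (Hz) and the real and imaginary parts of the relative permittivity.
freq = np.logspace(-2, 3, 51)                     # 0.01 Hz .. 1000 Hz
eps_real = 3.0 + 0.5 / (1 + (freq / 0.5) ** 2)    # illustrative dispersion
eps_imag = 0.12 / (1 + (freq / 0.5) ** 2) + 0.01

# tan(delta) = eps'' / eps' at each measurement frequency.
loss_tangent = eps_imag / eps_real

# Average loss tangent over the 0.01-1 Hz window used as the adhesion indicator.
window = (freq >= 0.01) & (freq <= 1.0)
avg_tan_delta = loss_tangent[window].mean()

print(f"average tan(delta) over 0.01-1 Hz: {avg_tan_delta:.3f}")
print("acceptable adhesion expected (tan(delta) > 0.035):", avg_tan_delta > 0.035)
```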

Relevance: 30.00%

Publisher:

Abstract:

Whole body vibration (WBV) aims to mechanically activate muscles by eliciting stretch reflexes. Mechanical vibrations are usually transmitted to the patient's body as they stand on an oscillating plate. WBV is increasingly used not only for fitness but also in physical therapy, rehabilitation and sports medicine. Effects depend on the intensity, direction and frequency of vibration; however, the training frequency is one of the most important factors involved. A preliminary vibratory session can be dedicated to finding the best vibration frequency for each subject by varying the stimulation frequency stepwise and analyzing the resulting EMG activity. This study concentrates on the analysis of muscle motion in response to a vibration frequency sweep while subjects held two different postures. The frequency of a vibrating platform was increased linearly from 10 to 60 Hz in 26 s, while platform and single-muscle (Rectus Femoris, Biceps Femoris - long head and Gastrocnemius Lateralis) motions were monitored using tiny, lightweight three-axial MEMS accelerometers. Displacements were estimated by integrating the acceleration data twice after removing the gravity contribution. The mechanical frequency response (amplitude and phase) of the mechanical chains ending at the single muscles was characterized. Results revealed a mechanical resonant-like behavior of some muscles, very similar to a second-order system in the frequency interval explored; resonance frequencies and damping factors depended on the subject and on his or her positioning on the vibrating platform. Stimulation at the resonant frequency maximizes muscle lengthening, and in turn muscle spindle solicitation, which produces muscle activation. © 2009 Springer-Verlag.
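
As an illustrative sketch (not the authors' processing chain), the following double-integrates a synthetic accelerometer trace to displacement after removing the gravity/DC contribution with a high-pass filter; the sampling rate, cut-off frequency and signal amplitude are assumptions.

```python
import numpy as np
from scipy import signal
from scipy.integrate import cumulative_trapezoid

fs = 1000.0                              # assumed accelerometer sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)

# Synthetic single-axis muscle acceleration during a sweep segment at ~25 Hz,
# plus gravity and a small offset.
accel = 9.81 + 0.05 + 2.0 * np.sin(2 * np.pi * 25 * t)

# Remove the gravity/DC contribution with a high-pass filter before integrating.
b, a = signal.butter(2, 5.0 / (fs / 2), btype="highpass")
accel_dyn = signal.filtfilt(b, a, accel)

# Integrate twice (acceleration -> velocity -> displacement), high-pass
# filtering after each step to suppress integration drift.
velocity = cumulative_trapezoid(accel_dyn, t, initial=0.0)
velocity = signal.filtfilt(b, a, velocity)
displacement = cumulative_trapezoid(velocity, t, initial=0.0)
displacement = signal.filtfilt(b, a, displacement)

peak_to_peak_mm = 1000.0 * (displacement.max() - displacement.min())
print(f"estimated peak-to-peak displacement: {peak_to_peak_mm:.2f} mm")
```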

Relevance: 30.00%

Publisher:

Abstract:

We propose a robust adaptive time synchronization and frequency offset estimation method for coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems by applying electrical dispersion pre-compensation (pre-EDC) to the pilot symbol. This technique effectively eliminates the timing error due to fiber chromatic dispersion, thus significantly increasing the accuracy of the frequency offset estimation process and improving the overall system performance. In addition, a simple design of the pilot symbol is proposed for full-range frequency offset estimation. This pilot symbol can also be used to carry useful data, effectively reducing the overhead due to time synchronization by a factor of 2.
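
The paper's pilot design is not reproduced here; the sketch below only illustrates the pre-EDC idea, applying the inverse of an assumed fibre chromatic-dispersion transfer function to a pilot so that it arrives undistorted after the link. The fibre length, dispersion parameter, sampling rate and pilot contents are assumptions.

```python
import numpy as np

# Assumed link parameters (not from the paper).
D = 17e-6            # dispersion parameter in s/m^2 (i.e. 17 ps/nm/km)
L = 1.0e6            # fibre length: 1000 km
lam = 1550e-9        # carrier wavelength (m)
c = 3e8              # speed of light (m/s)
fs = 10e9            # sampling rate of the transmitted signal (Hz)

rng = np.random.default_rng(0)
n = 1024
pilot = rng.choice([-1.0, 1.0], size=n) + 1j * rng.choice([-1.0, 1.0], size=n)

# Frequency-domain chromatic-dispersion transfer function of the fibre,
# H(f) = exp(-j * 2 * pi^2 * beta2 * L * f^2), with beta2 = -D * lambda^2 / (2*pi*c).
f = np.fft.fftfreq(n, d=1 / fs)
beta2_L = -(D * lam ** 2 / (2 * np.pi * c)) * L
H_fibre = np.exp(-1j * 2 * np.pi ** 2 * beta2_L * f ** 2)

# Pre-EDC: pre-distort the pilot with the inverse of H before transmission,
# so that propagation through the fibre restores the original waveform.
pre_edc_pilot = np.fft.ifft(np.fft.fft(pilot) / H_fibre)
received = np.fft.ifft(np.fft.fft(pre_edc_pilot) * H_fibre)

print("max residual distortion after the link:", np.max(np.abs(received - pilot)))
```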

Relevance: 30.00%

Publisher:

Abstract:

The sheer volume of citizen weather data collected and uploaded to online data hubs is immense. However, as with any citizen data, it is difficult to assess the accuracy of the measurements. Within this project we quantify just how much data is available, where it comes from, the frequency at which it is collected, and the types of automatic weather stations being used. We also list the numerous possible sources of error and uncertainty within citizen weather observations before showing evidence of such effects in real data. A thorough intercomparison field study was conducted, testing popular models of citizen weather stations. From this study we were able to parameterise key sources of bias. Most significantly, the project develops a complete quality control system through which citizen air temperature observations can be passed. The structure of this system was heavily informed by the results of the field study. Using a Bayesian framework, the system learns and updates its estimates of the calibration and radiation-induced biases inherent to each station. We then show the benefit of correcting for these learnt biases over using the original uncorrected data. The system also attaches an uncertainty estimate to each observation, providing real-world applications that choose to incorporate such observations with a measure on which to base their confidence in the data. The system relies on interpolated temperature and radiation observations from neighbouring professional weather stations, for which a Bayesian regression model is used. We recognise some of the assumptions and flaws of the developed system and suggest further work needed to bring it to an operational setting. Such a system will hopefully allow applications to leverage the additional value that citizen weather data bring to long-standing professional observing networks.
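
The full system also models a radiation-induced bias and uses Bayesian regression against neighbouring professional stations; as a much smaller sketch of the calibration-bias idea alone, the following performs a sequential conjugate normal-normal update with made-up numbers.

```python
import numpy as np

# Prior belief about one citizen station's calibration bias (deg C).
prior_mean, prior_var = 0.0, 1.0 ** 2

# Assumed variance of a single (citizen minus interpolated reference) comparison.
obs_var = 0.5 ** 2

# Hypothetical differences between the citizen reading and the reference (deg C).
diffs = np.array([0.8, 1.1, 0.6, 0.9, 1.3, 0.7])

# Sequential conjugate normal-normal update of the bias estimate.
mean, var = prior_mean, prior_var
for d in diffs:
    post_var = 1.0 / (1.0 / var + 1.0 / obs_var)
    mean = post_var * (mean / var + d / obs_var)
    var = post_var

reading = 18.4                             # a new raw citizen observation
corrected = reading - mean                 # bias-corrected observation
uncertainty = np.sqrt(var + obs_var)       # crude per-observation uncertainty
print(f"learnt bias: {mean:.2f} +/- {np.sqrt(var):.2f} deg C")
print(f"corrected observation: {corrected:.2f} +/- {uncertainty:.2f} deg C")
```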

Relevance: 30.00%

Publisher:

Abstract:

This chapter presents Radio Frequency Identification (RFID), one of the Automatic Identification and Data Capture (AIDC) technologies (Wamba and Boeck, 2008), and discusses the application of RFID in E-Commerce. First, RFID is defined and the tag and reader components of the RFID system are explained. The historical context of RFID is then briefly discussed. Next, RFID is contrasted with other AIDC technologies, especially barcodes, which are commonly applied in E-Commerce. Lastly, RFID applications in E-Commerce are discussed, with a focus on the achievable benefits, the obstacles to successful application of RFID in E-Commerce, and ways to alleviate them.

Relevance: 30.00%

Publisher:

Abstract:

Recently, wireless network technology has grown at such a pace that scientific research quickly becomes practical reality in a very short time span. Mobile wireless communications have witnessed the adoption of several generations, each of them complementing and improving the former. One mobile system that features high data rates and an open network architecture is 4G. Currently, the research community and industry in the field of wireless networks are working on possible choices for solutions in the 4G system. 4G is a collection of technologies and standards that will allow a range of ubiquitous computing and wireless communication architectures. The researcher considers one of the most important characteristics of future 4G mobile systems to be the ability to guarantee reliable communications at rates from 100 Mbps, for high-mobility links, up to 1 Gbps for low-mobility users, in addition to high efficiency in spectrum usage. In mobile wireless communication networks, one important factor is the coverage of large geographical areas. In 4G systems, a hybrid satellite/terrestrial network is crucial to providing users with coverage wherever needed. Subscribers thus require a reliable satellite link to access their services when they are in remote locations where a terrestrial infrastructure is unavailable, and must rely upon satellite coverage. A good modulation and access technique is also required in order to transmit high data rates over satellite links to mobile users. This technique must adapt to the characteristics of the satellite channel and also be efficient in the use of the allocated bandwidth. Satellite links are fading channels when used by mobile users. Some measures designed to address these fading environments make use of: (1) spatial diversity (a two-receive-antenna configuration); (2) time diversity (channel interleaver/spreading techniques); and (3) upper-layer FEC. The author proposes the use of OFDM (Orthogonal Frequency Division Multiplexing) for the satellite link, increasing the time diversity. This technique will allow an increase of the data rate, as primarily required by multimedia applications, and will also make optimal use of the available bandwidth. In addition, this dissertation approaches the use of cooperative satellite communications for hybrid satellite/terrestrial networks. By using this technique, the satellite coverage can be extended to areas where there is no direct link to the satellite. For this purpose, a good channel model is necessary.

Relevance: 30.00%

Publisher:

Abstract:

The accurate and reliable estimation of travel time based on point detector data is needed to support Intelligent Transportation System (ITS) applications. It has been found that the quality of travel time estimation is a function of the method used in the estimation and varies for different traffic conditions. In this study, two hybrid on-line travel time estimation models, and their corresponding off-line methods, were developed to achieve better estimation performance under various traffic conditions, including recurrent congestion and incidents. The first model combines the Mid-Point method, which is a speed-based method, with a traffic-flow-based method. The second model integrates two speed-based methods: the Mid-Point method and the Minimum Speed method. In both models, the switch between travel time estimation methods is based on the congestion level and queue status automatically identified by clustering analysis. During incident conditions with rapidly changing queue lengths, refinements based on shock wave analysis are applied to the on-line estimation to capture the fast queue propagation and recovery. Travel time estimates obtained from existing speed-based methods, traffic-flow-based methods, and the models developed were tested using both simulation and real-world data. The results indicate that all tested methods performed at an acceptable level during periods of low congestion; however, their performance varies as congestion increases. Comparisons with other estimation methods also show that the developed hybrid models perform well in all cases. Further comparisons between the on-line and off-line travel time estimation methods reveal that off-line methods perform significantly better only during fast-changing congested conditions, such as during incidents. The impacts of major influential factors on the performance of travel time estimation, including data preprocessing procedures, detector errors, detector spacing, frequency of travel time updates to traveler information devices, travel time link length, and posted travel time range, were investigated in this study. The results show that these factors have more significant impacts on estimation accuracy and reliability under congested conditions than during uncongested conditions. For incident conditions, the estimation quality improves with the use of a short rolling period for data smoothing, more accurate detector data, and frequent travel time updates.
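
The Mid-Point and Minimum Speed components can be sketched in a few lines; the toy version below replaces the study's clustering-based switching with a simple congestion flag, and all link lengths and speeds are hypothetical.

```python
def midpoint_travel_time(link_km: float, v_up_kmh: float, v_down_kmh: float) -> float:
    """Mid-Point (speed-based) estimate: each detector's speed is assumed to
    hold over the half of the link nearest to it. Returns seconds."""
    half = link_km / 2.0
    return 3600.0 * (half / v_up_kmh + half / v_down_kmh)

def minimum_speed_travel_time(link_km: float, v_up_kmh: float, v_down_kmh: float) -> float:
    """Conservative estimate using the lower of the two detector speeds,
    more appropriate when a queue covers most of the link. Returns seconds."""
    return 3600.0 * link_km / min(v_up_kmh, v_down_kmh)

def hybrid_travel_time(link_km: float, v_up: float, v_down: float, congested: bool) -> float:
    """Toy switch between the two estimators based on a congestion flag."""
    if congested:
        return minimum_speed_travel_time(link_km, v_up, v_down)
    return midpoint_travel_time(link_km, v_up, v_down)

# 1 km link, free flow upstream (90 km/h), queued downstream (20 km/h).
print(round(hybrid_travel_time(1.0, 90.0, 20.0, congested=False), 1), "s")  # 110.0 s
print(round(hybrid_travel_time(1.0, 90.0, 20.0, congested=True), 1), "s")   # 180.0 s
```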

Relevance: 30.00%

Publisher:

Abstract:

A method to estimate the speed of free-ranging fishes using a passive sampling device is described and illustrated with data from the Everglades, U.S.A. Catch per unit effort (CPUE) from minnow traps embedded in drift fences was treated as an encounter rate and used to estimate speed when combined with an independent estimate of density obtained using throw traps that enclose 1 m2 of marsh habitat. Underwater video was used to evaluate the capture efficiency and species-specific bias of minnow traps, and two sampling studies were used to estimate trap saturation and diel movement patterns; these results were used to optimize sampling and derive correction factors to adjust species-specific encounter rates for bias and capture efficiency. Sailfin mollies Poecilia latipinna displayed a high frequency of escape from traps, whereas eastern mosquitofish Gambusia holbrooki were most likely to avoid a trap once they encountered it; dollar sunfish Lepomis marginatus were least likely to avoid the trap once they encountered it or to escape once they were captured. Length of sampling and time of day affected CPUE; fishes generally had a very low retention rate over a 24 h sample time, and only the Everglades pygmy sunfish Elassoma evergladei were commonly captured at night. Dispersal speed of fishes in the Florida Everglades, U.S.A., was shown to vary seasonally and among species, ranging from 0.05 to 0.15 m s−1 for small poeciliids and fundulids to 0.1–1.8 m s−1 for L. marginatus. Speed was generally highest late in the wet season and lowest in the dry season, possibly tied to dispersal behaviours linked to finding and remaining in dry-season refuges. These speed estimates can be used to estimate the diffusive movement rate, which is commonly employed in spatial ecological models.
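
The abstract does not give the estimator's exact form; purely as a toy reduction of the encounter-rate idea, the sketch below converts a corrected catch rate and an independent density estimate into a speed, folding all of the bias and capture-efficiency corrections described above into a single factor. Every number used is hypothetical.

```python
def dispersal_speed(cpue_per_hour: float,
                    density_per_m2: float,
                    effective_width_m: float,
                    capture_efficiency: float) -> float:
    """Toy encounter-rate calculation: the corrected catch rate divided by the
    number of fish expected to cross the trap front per metre travelled.
    Returns speed in m/s."""
    corrected_rate = cpue_per_hour / capture_efficiency        # encounters per hour
    encounters_per_metre = density_per_m2 * effective_width_m  # fish per metre moved
    return (corrected_rate / encounters_per_metre) / 3600.0

# Hypothetical values: 2 captures per trap-hour, 10 fish per m2 of marsh,
# a 0.3 m effective fence opening, and 25% capture efficiency.
print(f"{dispersal_speed(2.0, 10.0, 0.3, 0.25):.4f} m/s")
```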

Relevance: 30.00%

Publisher:

Abstract:

Lake Analyzer is a numerical code coupled with supporting visualization tools for determining indices of mixing and stratification that are critical to the biogeochemical cycles of lakes and reservoirs. Stability indices, including Lake Number, Wedderburn Number, Schmidt Stability, and thermocline depth are calculated according to established literature definitions and returned to the user in a time series format. The program was created for the analysis of high-frequency data collected from instrumented lake buoys, in support of the emerging field of aquatic sensor network science. Available outputs for the Lake Analyzer program are: water temperature (error-checked and/or down-sampled), wind speed (error-checked and/or down-sampled), metalimnion extent (top and bottom), thermocline depth, friction velocity, Lake Number, Wedderburn Number, Schmidt Stability, mode-1 vertical seiche period, and Brunt-Väisälä buoyancy frequency. Secondary outputs for several of these indices delineate the parent thermocline depth (seasonal thermocline) from the shallower secondary or diurnal thermocline. Lake Analyzer provides a program suite and best practices for the comparison of mixing and stratification indices in lakes across gradients of climate, hydro-physiography, and time, and enables a more detailed understanding of the resulting biogeochemical transformations at different spatial and temporal scales.
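
Lake Analyzer itself is a program suite rather than the snippet below; as a small illustration of one of its outputs, this sketch computes the Brunt-Väisälä buoyancy frequency from a hypothetical thermistor-chain profile using a standard fresh-water density approximation. The profile values, and the use of the depth of maximum N² as a thermocline proxy, are assumptions made for the example.

```python
import numpy as np

# Hypothetical mid-summer temperature profile from a lake buoy thermistor chain.
depth_m = np.array([0.5, 1, 2, 3, 4, 5, 6, 8, 10, 12, 15])
temp_c = np.array([24.8, 24.7, 24.5, 23.9, 21.0, 16.5, 13.0, 10.5, 9.0, 8.2, 7.9])

def water_density(t):
    """Fresh-water density (kg/m3) from temperature (deg C), polynomial approximation."""
    return 1000.0 * (1.0 - (t + 288.9414) / (508929.2 * (t + 68.12963)) * (t - 3.9863) ** 2)

rho = water_density(temp_c)
g = 9.81

# Buoyancy frequency squared, N^2 = (g / rho) * d(rho)/dz with z as depth
# (positive downward), evaluated midway between adjacent thermistors.
drho_dz = np.diff(rho) / np.diff(depth_m)
rho_mid = 0.5 * (rho[:-1] + rho[1:])
N2 = (g / rho_mid) * drho_dz
mid_depth = 0.5 * (depth_m[:-1] + depth_m[1:])

# The depth of maximum N^2 is one simple proxy for the thermocline depth.
print("max N^2 = %.4f s^-2 at %.1f m depth" % (N2.max(), mid_depth[np.argmax(N2)]))
```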