834 results for raw-fibre-to-end-product


Relevance:

100.00%

Publisher:

Abstract:

This special issue provides the latest research and development on wireless mobile wearable communications. According to a report by Juniper Research, the market value of connected wearable devices is expected to reach $1.5 billion by 2014, and the shipment of wearable devices may reach 70 million by 2017. Good examples of wearable devices are the prominent Google Glass and Microsoft HoloLens. As wearable technology is rapidly penetrating our daily life, mobile wearable communication is becoming a new communication paradigm. Mobile wearable device communications create new challenges compared to ordinary sensor networks and short-range communication. In mobile wearable communications, devices communicate with each other in a peer-to-peer fashion or client-server fashion and also communicate with aggregation points (e.g., smartphones, tablets, and gateway nodes). Wearable devices are expected to integrate multiple radio technologies for various applications' needs with small power consumption and low transmission delays. These devices can hence collect, interpret, transmit, and exchange data among supporting components, other wearable devices, and the Internet. Such data are not limited to people's personal biomedical information but also include human-centric social and contextual data. The success of mobile wearable technology depends on communication and networking architectures that support efficient and secure end-to-end information flows. A key design consideration of future wearable devices is the ability to ubiquitously connect to smartphones or the Internet with very low energy consumption. Radio propagation and, accordingly, channel models are also different from those in other existing wireless technologies. A huge number of connected wearable devices require novel big data processing algorithms, efficient storage solutions, cloud-assisted infrastructures, and spectrum-efficient communications technologies.


This article examines the influence on the engineering design process of the primary objective of validation, whether that is proving a model, a technology or a product. Through the examination of a number of stiffened panel case studies, the relationships between simulation, validation, design and the final product are established and discussed. The work demonstrates the complex interactions between the original (or anticipated) design model, the analysis model, the validation activities and the product in service. The outcome clearly shows some unintended consequences. High-fidelity validation test simulations require a different set of detailed parameters to accurately capture behaviour. In doing so, they diverge from the original computer-aided design model, intrinsically limiting the value of the validation with respect to the product. This work represents a shift from the traditional perspective of encapsulating and controlling errors between simulation and experimental test to consideration of the wider design-test process. Specifically, it is a reflection on the implications of how models are built and validated, and the effect on results and on understanding of structural behaviour. The article then identifies key checkpoints in the design process and how these should be used to update the computer-aided design system parameters for a design. This work strikes at a fundamental challenge in understanding the interaction between design, certification and operation of any complex system.


Medicines reconciliation is a way to identify and act on discrepancies in patients' medication histories, and it has been found to play a key role in patient safety. This review focuses on discrepancies and medication errors that occurred at the point of discharge from hospital. Studies were identified through the following electronic databases: PubMed, ScienceDirect, EMBASE, Google Scholar, Cochrane Reviews and CINAHL. Each of the six databases was screened from inception to the end of January 2014. To determine eligibility, the title, abstract and full manuscript of each study were screened, yielding 15 articles that met the inclusion criteria. The median rate of discrepancies across the articles was 60%. On average, patients had between 1.2 and 5.3 discrepancies when leaving the hospital. Several studies also found a relationship between the number of drugs a patient was taking and the number of discrepancies. The variation in the number of discrepancies found across the 15 studies may be due to the fact that some studies excluded patients taking more than five drugs at admission. Medication reconciliation would be a way to avoid the high number of discrepancies found in this literature review and thereby increase patient safety.


A theoretical analysis is reported in this paper to investigate the effect that a second harmonic signal, which might be present at an amplifier's input, has on generating additional intermodulation products, particularly third-order intermodulation (IM3) products. The analysis shows that the amplitude of the extra generated IM3 component equals the product of the fundamental amplitude, the second harmonic amplitude, and the second-order Taylor series coefficient. The effect of the second harmonic on the IM3 is examined through a simulated example of a 2.22-GHz, 10-W Class-EF amplifier, in which the IM3 levels were reduced by 2-3 dB after employing a second harmonic termination stub at the input.
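
As a numerical sanity check of the stated relationship, the short sketch below feeds two fundamentals plus an input second harmonic through a purely second-order nonlinearity and reads the amplitude of the resulting product at 2f1 - f2 off an FFT. The tone frequencies, amplitudes and the coefficient a2 are all invented; they are not taken from the Class-EF amplifier example.

```python
import numpy as np

# Hypothetical illustration: a memoryless second-order nonlinearity
# y = a2 * x^2 mixes a second harmonic at 2*f1 with the fundamental at f2,
# producing an extra IM3-like product at 2*f1 - f2 whose amplitude is
# a2 * A2 * Ah (second-order coefficient times the two input amplitudes).

fs = 100_000          # sample rate (Hz)
n = 100_000           # 1-second window -> 1 Hz FFT bin spacing, no leakage
t = np.arange(n) / fs

f1, f2 = 1000.0, 1100.0        # two fundamental tones (Hz)
A1, A2, Ah = 1.0, 0.8, 0.05    # fundamentals and input second harmonic of f1
a2 = 0.3                       # second-order Taylor series coefficient

x = (A1 * np.cos(2 * np.pi * f1 * t)
     + A2 * np.cos(2 * np.pi * f2 * t)
     + Ah * np.cos(2 * np.pi * 2 * f1 * t))  # second harmonic at the input
y = a2 * x**2                                 # second-order term in isolation

spectrum = np.abs(np.fft.rfft(y)) / n * 2     # single-sided amplitude spectrum
im3_bin = int(2 * f1 - f2)                    # 900 Hz with 1 Hz bins
measured = spectrum[im3_bin]
predicted = a2 * A2 * Ah

print(measured, predicted)
```

With the chosen values the 900 Hz bin matches a2 * A2 * Ah = 0.012, illustrating that the extra product scales with the product of the fundamental amplitude, the second-harmonic amplitude and the second-order coefficient.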


With the development and deployment of IEC 61850 based smart substations, cybersecurity vulnerabilities of supervisory control and data acquisition (SCADA) systems are increasingly emerging. In response, a test-bed is indispensable for cybersecurity experimentation. In this paper, a comprehensive and realistic cyber-physical test-bed has been built to investigate potential cybersecurity vulnerabilities and the impact of cyber-attacks on IEC 61850 based smart substations. The test-bed is close to a real production environment and supports end-to-end testing of cyber-attacks and their physical consequences. A fuzz testing approach for detecting vulnerabilities in IEC 61850 based intelligent electronic devices (IEDs) is proposed and validated on the test-bed.
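
A mutation-based fuzzing loop of the kind such a test-bed enables can be sketched as below. The seed frame, the byte-flip mutator and the deliberately buggy toy parser are all invented stand-ins, not the paper's actual tool or a real IED; a real campaign would send mutated IEC 61850 (e.g. MMS/GOOSE) frames to the device over the network and watch for crashes, resets or timeouts.

```python
import random

def mutate(packet: bytes, rng: random.Random, n_flips: int = 3) -> bytes:
    """Randomise a few bytes of a seed packet."""
    data = bytearray(packet)
    for _ in range(n_flips):
        data[rng.randrange(len(data))] = rng.randrange(256)
    return bytes(data)

def toy_parser(packet: bytes) -> bool:
    """Stand-in for the device under test, with a planted length-check bug."""
    declared = packet[1]            # pretend byte 1 is a payload length field
    if declared > len(packet) - 2:  # overread: simulated parsing crash
        raise IndexError("buffer overread")
    return True

def fuzz(seed: bytes, iterations: int = 200, rng_seed: int = 42):
    """Collect the mutated inputs that crash the parser."""
    rng = random.Random(rng_seed)
    crashing = []
    for _ in range(iterations):
        candidate = mutate(seed, rng)
        try:
            toy_parser(candidate)
        except Exception:
            crashing.append(candidate)  # record crash-triggering inputs
    return crashing

seed_packet = bytes([0x61, 0x04, 0x00, 0x01, 0x02, 0x03])  # fake TLV-ish frame
crashes = fuzz(seed_packet)
print(f"{len(crashes)} of 200 mutated frames crashed the toy parser")
```

The same loop structure (mutate, send, observe) carries over when the "parser" is a physical IED and the observation is a heartbeat or watchdog check.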


A multiuser dual-hop relaying system over mixed radio frequency/free-space optical (RF/FSO) links is investigated. Specifically, the system consists of m single-antenna sources, a relay node equipped with n ≥ m receive antennas and a single photo-aperture transmitter, and one destination equipped with a single photo-detector. RF links are used for the simultaneous data transmission from multiple sources to the relay. The relay operates under the decode-and-forward protocol and utilizes the popular V-BLAST technique by successively decoding each user's transmitted stream. Two common norm-based orderings are adopted, i.e., the streams are decoded in an ascending or a descending order. After V-BLAST, the relay retransmits the decoded information to the destination via a point-to-point FSO link in m consecutive timeslots. Analytical expressions for the end-to-end outage probability and average symbol error probability of each user are derived, while closed-form asymptotic expressions are also presented. Capitalizing on the derived results, some engineering insights are manifested, such as the coding and diversity gain of each user, the impact of the pointing error displacement on the FSO link and the V-BLAST ordering effectiveness at the relay.
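
The norm-ordered successive decoding step at the relay can be sketched as follows. The sizes, the real-valued channel and the BPSK alphabet are illustrative simplifications (and the receive vector is noiseless), not the paper's actual system model with noisy RF links.

```python
import numpy as np

# Sketch of V-BLAST zero-forcing SIC with norm-based ordering: detect streams
# one at a time, strongest (or weakest) channel column first, cancelling each
# decoded stream before detecting the next.

rng = np.random.default_rng(0)
m, n_rx = 3, 4                        # m single-antenna sources, n >= m antennas
H = rng.normal(size=(n_rx, m))        # real-valued channel for simplicity
s = rng.choice([-1.0, 1.0], size=m)   # BPSK symbols, one per source
y = H @ s                             # noiseless receive vector (illustrative)

def vblast_zf_sic(H, y, descending=True):
    """Zero-forcing SIC with norm-based ordering; returns detected symbols."""
    y = y.copy()
    m = H.shape[1]
    remaining = list(range(m))
    detected = np.zeros(m)
    while remaining:
        norms = [np.linalg.norm(H[:, k]) for k in remaining]
        pick = int(np.argmax(norms)) if descending else int(np.argmin(norms))
        idx = remaining[pick]
        g = np.linalg.pinv(H[:, remaining])       # ZF filter on remaining cols
        est = g[remaining.index(idx)] @ y         # soft estimate of stream idx
        detected[idx] = 1.0 if est >= 0 else -1.0 # slice to BPSK
        y = y - H[:, idx] * detected[idx]         # cancel the decoded stream
        remaining.remove(idx)
    return detected

det = vblast_zf_sic(H, y)
print(det, s)
```

In this noiseless toy case the detector recovers all m streams exactly; the ordering choice only matters once noise makes early decoding errors propagate.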


The next-generation sequencing revolution has enabled rapid discovery of genetic markers; however, developing fully functioning new markers still requires a long and costly process of marker validation. This study reports a rapid and economical approach for the validation and deployment of polymorphic microsatellite markers obtained from a 454 pyrosequencing library of Atlantic cod, Gadus morhua, Linnaeus 1758. Primers were designed from raw reads to amplify specific amplicon size ranges, allowing effective PCR multiplexing. Multiplexing was combined with a three-primer PCR approach using four universal tails to label amplicons with separate fluorochromes. A total of 192 primer pairs were tested, resulting in 73 polymorphic markers. Of these, 55 loci were combined in six multiplex panels, each containing between six and eleven markers. Variability of the loci was assessed in G. morhua from the Celtic Sea (n = 46) and the Scotian Shelf (n = 46), two locations that have shown genetic differentiation in previous studies. Multilocus FST between the two samples was estimated at 0.067 (P < 0.001). After three loci potentially under selection were excluded, the global FST was estimated at 0.043 (P < 0.001). Our technique combines three-primer and multiplex PCR techniques, allowing simultaneous screening and validation of relatively large numbers of microsatellite loci.
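
The size-range-based multiplexing idea can be illustrated with a toy panel-assembly routine: two markers can share a panel and a dye only if their expected amplicon size ranges do not overlap. The marker names, dyes and size ranges below are invented, and a real design would also screen for primer-primer interactions.

```python
# Toy greedy first-fit assembly of multiplex panels. Each marker is
# (name, fluorescent dye, (min_size, max_size)); a clash means same dye
# AND overlapping amplicon size range within one panel.

markers = [
    ("Gmo-A", "FAM", (100, 120)),
    ("Gmo-B", "FAM", (150, 170)),
    ("Gmo-C", "FAM", (110, 130)),   # overlaps Gmo-A on the same dye
    ("Gmo-D", "HEX", (100, 140)),   # different dye, so overlap is fine
]

def overlaps(a, b):
    """True if two (lo, hi) size ranges intersect."""
    return a[0] <= b[1] and b[0] <= a[1]

def build_panels(markers):
    """Place each marker in the first panel where it clashes with nothing."""
    panels = []
    for name, dye, size in markers:
        for panel in panels:
            if all(not (dye == d and overlaps(size, s)) for _, d, s in panel):
                panel.append((name, dye, size))
                break
        else:
            panels.append([(name, dye, size)])  # no panel fits: open a new one
    return panels

panels = build_panels(markers)
for i, panel in enumerate(panels, 1):
    print(f"panel {i}:", [name for name, _, _ in panel])
```

Here Gmo-C is forced into a second panel because it collides with Gmo-A on the FAM dye, while Gmo-D co-plexes with Gmo-A despite the size overlap because it carries a different fluorochrome.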


This study introduces an inexact, but ultra-low-power, computing architecture devoted to the embedded analysis of bio-signals. The platform operates at extremely low supply voltages to minimise energy consumption. In this regime, the reliability of static RAM (SRAM) memories cannot be guaranteed with conventional six-transistor implementations. While error correction codes and dedicated SRAM implementations can ensure correct operation in this near-threshold regime, they incur significant area and energy overheads and should therefore be employed judiciously. Herein, the authors propose a novel scheme for designing inexact computing architectures that selectively protects memory regions based on their significance, i.e. their impact on the end-to-end quality of service, as dictated by the bio-signal application characteristics. The authors illustrate their scheme on an industrial benchmark application performing power spectrum analysis of electrocardiograms. Experimental evidence shows that a significance-based memory protection approach leads to only a small degradation in output quality with respect to an exact implementation, while yielding substantial energy gains in both the memory and the processing subsystem.
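
The selective-protection idea can be caricatured in a few lines: the buffer region that drives the end-to-end output lives in "protected" memory, while the rest stays in unprotected near-threshold memory where a read-back may suffer a random bit flip. The buffer contents, the choice of significant region and the flip probability below are all invented, not the paper's figures.

```python
import random

WORD_BITS = 16

def sram_readback(samples, protected, p_flip, rng):
    """Model an SRAM read-back: unprotected words may take a single bit flip."""
    out = []
    for word, safe in zip(samples, protected):
        if not safe and rng.random() < p_flip:
            word ^= 1 << rng.randrange(WORD_BITS)  # random single-bit upset
        out.append(word)
    return out

rng = random.Random(1)
samples = [1000 + i for i in range(256)]     # pretend ECG feature buffer
significant = [i < 64 for i in range(256)]   # first quarter drives the output
stored = sram_readback(samples, significant, p_flip=0.05, rng=rng)

errors_in_significant = sum(
    1 for i in range(256) if significant[i] and stored[i] != samples[i])
print("bit-flip errors in the protected region:", errors_in_significant)
```

Errors are thus confined to the insignificant region, which is the mechanism by which output quality degrades only slightly while most of the memory avoids protection overhead.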


Wearable devices performing advanced bio-signal analysis algorithms aim to foster a revolution in healthcare provision for chronic cardiac diseases. In this context, energy efficiency is of paramount importance, as long-term monitoring must be ensured while relying on a tiny power source. Operating at a scaled supply voltage, just above the threshold voltage, saves substantial energy, but it makes circuits, and especially memories, more prone to errors, threatening the correct execution of algorithms. Error detection and correction codes can protect the entire memory content, but they incur large area and energy overheads that may not be compatible with the tight energy budgets of wearable systems. To cope with this challenge, in this paper we propose to limit the overhead of traditional schemes by selectively detecting and correcting errors only in the data that most affect the end-to-end quality of service of ultra-low-power wearable electrocardiogram (ECG) devices. This partitioning protects either the significant words or the significant bits of each data element, according to the application characteristics (the statistical properties of the data in the application buffers) and their impact on the output. On real ECG signals, the proposed heterogeneous error protection scheme allows substantial energy savings (11% in wearable devices) compared to state-of-the-art approaches, such as ECC protection of the whole memory, while causing negligible output quality degradation in the evaluated power spectrum analysis application.
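
A minimal sketch of the significant-bit variant, with invented parameters rather than the paper's: if the error code covers only the upper bits of each 16-bit sample, an uncorrected flip is confined to the low bits, so the worst-case magnitude error per sample is bounded.

```python
import random

WORD_BITS = 16
PROTECTED_MSBS = 10                  # upper bits covered by the error code
UNPROT = WORD_BITS - PROTECTED_MSBS  # low 6 bits left unprotected

def protected_readback(word, rng, p_flip=0.5):
    """A flip may hit any bit; flips in the protected MSBs are corrected."""
    if rng.random() < p_flip:
        bit = rng.randrange(WORD_BITS)
        if bit < UNPROT:             # flip in the unprotected low bits persists
            return word ^ (1 << bit)
    return word                      # no flip, or flip corrected by the code

rng = random.Random(7)
sample = 0b1010_1100_0011_0101
worst = max(abs(protected_readback(sample, rng) - sample) for _ in range(1000))
print("worst-case sample error:", worst, "bound:", 2**UNPROT - 1)
```

Bounding per-sample error this way, instead of preventing all errors, is what lets the scheme trade a small, quantifiable output distortion for the energy saved by not protecting the low bits.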