97 results for MPIX (Electronic computer system).


Relevance: 100.00%

Publisher:

Abstract:

Statistics-based Internet traffic classification using machine learning techniques has attracted extensive research interest lately, because of the increasing ineffectiveness of traditional port-based and payload-based approaches. In particular, unsupervised learning, that is, traffic clustering, is very important in real-life applications, where labeled training data are difficult to obtain and new patterns keep emerging. Although previous studies have applied some classic clustering algorithms such as K-Means and EM for the task, the quality of resultant traffic clusters was far from satisfactory. In order to improve the accuracy of traffic clustering, we propose a constrained clustering scheme that makes decisions with consideration of some background information in addition to the observed traffic statistics. Specifically, we make use of equivalence set constraints indicating that particular sets of flows are using the same application layer protocols, which can be efficiently inferred from packet headers according to the background knowledge of TCP/IP networking. We model the observed data and constraints using Gaussian mixture density and adapt an approximate algorithm for the maximum likelihood estimation of model parameters. Moreover, we study the effects of unsupervised feature discretization on traffic clustering by using a fundamental binning method. A number of real-world Internet traffic traces have been used in our evaluation, and the results show that the proposed approach not only improves the quality of traffic clusters in terms of overall accuracy and per-class metrics, but also speeds up the convergence.
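The equivalence-set constraints can be sketched in miniature: flows known from packet headers to share an application-layer protocol are forced into a common cluster. The snippet below is a minimal illustration of that constraint-enforcement step only, not the paper's approximate maximum-likelihood algorithm; the majority-vote rule and all names are illustrative assumptions.

```python
import numpy as np

def enforce_equivalence_sets(labels, sets):
    """Force all flows in each equivalence set into the set's majority cluster.

    labels : 1-D array of per-flow cluster labels from any base clusterer.
    sets   : list of index arrays; the flows in one set are inferred to share
             an application-layer protocol, so they should share a cluster.
    """
    labels = np.asarray(labels).copy()
    for idx in sets:
        values, counts = np.unique(labels[idx], return_counts=True)
        labels[idx] = values[np.argmax(counts)]  # majority vote within the set
    return labels

labels = np.array([0, 0, 1, 1, 2, 2])            # raw clustering output
sets = [np.array([0, 1, 2]), np.array([4, 5])]   # flows known to belong together
constrained = enforce_equivalence_sets(labels, sets)
```

Here the third flow is pulled into cluster 0 because the majority of its equivalence set landed there.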


This paper presents subdivision-based vector graphics for image representation and creation. The representation is a subdivision surface defined by a triangular mesh augmented with color attributes at vertices and feature attributes at edges. Special cubic B-splines are proposed to describe the curvilinear features of an image. New subdivision rules are then designed accordingly and applied to the mesh and the color attributes to define the spatial distribution and piecewise-smoothly varying colors of the image. A sharpness factor is introduced to control the color transition across curvilinear edges. In addition, an automatic algorithm is developed to convert a raster image into this vector graphics representation. The algorithm first detects the curvilinear features of the image, then constructs a triangulation based on the curvilinear edges and feature attributes, and finally iteratively optimizes the vertex color attributes and updates the triangulation. Compared with existing vector-based image representations, the proposed representation and algorithm have the following advantages in addition to the common merits (such as editability and scalability): 1) they allow flexible mesh topology and handle images or objects with complicated boundaries or features effectively; 2) they faithfully reconstruct curvilinear features, especially when modeling subtle shading effects around feature curves; and 3) they offer a simple way for the user to create images in a freehand style. The effectiveness of the proposed method has been demonstrated in experiments.
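The flavor of such subdivision rules can be illustrated with the standard cubic B-spline refinement masks on a closed control polygon: a new midpoint vertex is inserted on every edge, and each old vertex is smoothed with the (1, 6, 1)/8 stencil. This is a generic sketch; the paper's rules additionally propagate color and feature attributes, which are omitted here.

```python
import numpy as np

def subdivide_cubic_bspline(points):
    """One round of cubic B-spline subdivision on a closed control polygon.

    Each old vertex is repositioned with the (1, 6, 1)/8 mask and a new
    midpoint vertex is inserted on every edge, doubling the vertex count.
    points : (n, d) array of control points (closed curve, indices wrap).
    """
    prev_ = np.roll(points, 1, axis=0)
    next_ = np.roll(points, -1, axis=0)
    vertex_pts = (prev_ + 6.0 * points + next_) / 8.0  # smoothed old vertices
    edge_pts = (points + next_) / 2.0                  # new edge midpoints
    out = np.empty((2 * len(points), points.shape[1]))
    out[0::2] = vertex_pts
    out[1::2] = edge_pts
    return out

square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
refined = subdivide_cubic_bspline(square)
```

Repeating the step converges to the smooth cubic B-spline curve; the masks are affine-invariant, so the polygon's centroid is preserved.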


Restraining the spread of rumors in online social networks (OSNs) has long been an important but difficult problem. Currently, there are two main types of methods: 1) blocking rumors at the most influential users or community bridges, or 2) spreading truths to clarify the rumors. Each method claims better performance than the others under its own assumptions and environments, yet no universal standard exists to evaluate them, which makes it difficult to tell which one truly stands out. In this paper, we address this problem by carrying out a series of empirical and theoretical analyses on the basis of an introduced mathematical model. On this mathematical platform, each method is evaluated using real OSN data. We perform three types of analysis in this work. First, we compare all the measures for locating important users. The results suggest that the degree and betweenness measures outperform all the others in the Facebook network. Second, we analyze the truth clarification method and find that it has long-term performance, while the degree measure performs well only in the early stage. Third, in order to leverage both methods, we further explore strategies in which the two work together and study their equivalence. Given a fixed budget in the real world, our analysis provides a potential solution for finding a better strategy by integrating both types of methods. From both the academic and technical perspectives, the work in this paper is an important step toward practical and optimal strategies for restraining rumors in OSNs.
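The degree measure from the first analysis can be sketched as a generic top-k-by-degree selection of blocking targets; the graph and names below are illustrative, and betweenness would additionally require shortest-path counting on top of this.

```python
def top_k_by_degree(adj, k):
    """Pick the k most-connected users as rumor-blocking targets.

    adj : dict mapping user -> iterable of neighbours (undirected graph).
    Degree is the simplest of the influence measures compared; ties are
    broken alphabetically for determinism.
    """
    degrees = {u: len(nbrs) for u, nbrs in adj.items()}
    return sorted(degrees, key=lambda u: (-degrees[u], u))[:k]

adj = {
    "a": ["b", "c", "d"],
    "b": ["a"],
    "c": ["a", "d"],
    "d": ["a", "c"],
}
blocked = top_k_by_degree(adj, 1)  # the hub "a" is the first blocking target
```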


Multicast is an important mechanism in modern wireless networks and has attracted significant efforts to improve its performance with respect to different metrics, including throughput, delay, and energy efficiency. Traditionally, an ideal loss-free channel model is widely used to facilitate routing protocol design. However, the quality of wireless links is degraded, or even jeopardized, by many factors such as collisions, fading, or environmental noise, resulting in transmission failures. In this paper, we propose a reliable multicast protocol, called CodePipe, that achieves energy efficiency, high throughput, and fairness in lossy wireless networks. Building upon opportunistic routing and random linear network coding, CodePipe not only eliminates coordination between nodes but also improves multicast throughput significantly by exploiting both intra-batch and inter-batch coding opportunities. In particular, four key techniques, namely an LP-based opportunistic routing structure, opportunistic feeding, fast batch moving, and inter-batch coding, are proposed to offer significant improvements in throughput, energy efficiency, and fairness. Moreover, we design an efficient online extension of CodePipe so that it can work in a dynamic network where nodes join and leave as time progresses. We evaluate CodePipe in the ns-2 simulator against two other state-of-the-art multicast protocols, MORE and Pacifier. Simulation results show that CodePipe significantly outperforms both of them.
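Random linear network coding, one of the two building blocks named above, can be sketched over GF(2): coded packets are random XOR combinations of a batch of source packets, and any full-rank set of coded packets recovers the batch by Gaussian elimination. This is a generic illustration under simplified assumptions (GF(2) rather than a larger field, single batch, no inter-batch coding), not CodePipe itself.

```python
import numpy as np

rng = np.random.default_rng(7)

def rlnc_encode(batch, n_coded, rng):
    """Random linear network coding over GF(2): each coded packet is a
    random XOR combination of the k source packets in a batch."""
    k = batch.shape[0]
    coeffs = rng.integers(0, 2, size=(n_coded, k))
    return coeffs, coeffs.dot(batch) % 2

def rlnc_decode(coeffs, coded):
    """Recover the source batch by Gaussian elimination over GF(2);
    returns None when the coefficient vectors do not have full rank."""
    k = coeffs.shape[1]
    aug = np.concatenate([coeffs, coded], axis=1) % 2
    row = 0
    for col in range(k):
        pivot = next((r for r in range(row, len(aug)) if aug[r, col]), None)
        if pivot is None:
            return None  # rank deficient: more coded packets are needed
        aug[[row, pivot]] = aug[[pivot, row]]    # move pivot row into place
        for r in range(len(aug)):
            if r != row and aug[r, col]:
                aug[r] = (aug[r] + aug[row]) % 2  # eliminate (XOR) this row
        row += 1
    return aug[:k, k:]

batch = np.array([[1, 0, 1, 1],
                  [0, 1, 1, 0],
                  [1, 1, 0, 0]])  # a batch of 3 source packets, 4 bits each
decoded = None
while decoded is None:  # redraw until the random coefficients have full rank
    coeffs, coded = rlnc_encode(batch, n_coded=4, rng=rng)
    decoded = rlnc_decode(coeffs, coded)
```

Because decoding needs only any full-rank subset of coded packets, intermediate nodes can recode without coordinating about which specific packets were lost, which is why the technique suits lossy wireless links.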


PURPOSE: Thousands of children are living with advanced cancer; yet patient-reported outcomes (PROs) have rarely been used to describe their experiences. We aimed to describe symptom distress in 104 children age 2 years or older with advanced cancer enrolled onto the Pediatric Quality of Life and Evaluation of Symptoms Technology (PediQUEST) Study (multisite clinical trial evaluating an electronic PRO system).

METHODS: Symptom data were collected using age- and respondent-adapted versions of the PediQUEST Memorial Symptom Assessment Scale (PQ-MSAS) at most once per week. Clinical and treatment data were obtained from medical records. Individual symptom scores were dichotomized into high/low distress. Determinants of PQ-MSAS scores were explored using linear mixed-effects models.

RESULTS: During 9 months of follow-up, PQ-MSAS was administered 920 times: 459 times in teens (99% self-report), 249 times in children ages 7 to 12 years (96% child/parent report), and 212 times in those ages 2 to 6 years (parent reports). Common symptoms included pain (48%), fatigue (46%), drowsiness (39%), and irritability (37%); most scores indicated high distress. Among the 73 PQ-MSAS surveys administered in the last 12 weeks of life, pain was highly prevalent (62%; 58% with high distress). Being female, having a brain tumor, experiencing recent disease progression, and receiving moderate- or high-intensity cancer-directed therapy in the prior 10 days were associated with worse PQ-MSAS scores. In the final 12 weeks of life, receiving mild cancer-directed therapy was associated with improved psychological PQ-MSAS scores.

CONCLUSION: Children with advanced cancer experience high symptom distress. Strategies to promote intensive symptom management are indicated, especially with disease progression or administration of intensive treatments.


OBJECTIVES: Actual and perceived object control (commonly ball) skill proficiency is associated with higher physical activity in children and adolescents. Active video games (AVGs) encourage whole-body movement to control/play the electronic gaming system and therefore provide an opportunity for screen time to become more active. The purpose of this study was to determine whether playing sports AVGs has a positive influence on young children's actual and perceived object control skills. DESIGN: Two-group pre/post experimental design. METHODS: Thirty-six children aged 6-10 years from one school were randomly allocated to a control or intervention condition. The Test of Gross Motor Development-3 assessed actual object control skill. The Pictorial Scale of Perceived Competence for Young Children assessed perceived object control skill. The intervention consisted of six 50-min lunchtime AVG sessions on the Xbox Kinect. Two to three sports games were chosen for participants to play in each session. General linear models with either perceived or actual object control skill as the outcome variable were conducted. Each base model adjusted for intervention status and the pre-score of the respective outcome variable. Additional models adjusted for potential confounding variables (sex of child and game at home). RESULTS: No significant differences between the control and intervention groups were observed for either outcome. CONCLUSIONS: This study found that playing the Xbox Kinect does not significantly influence children's perceived or actual object control skills, suggesting that the utility of the Xbox Kinect for developing perceived and actual object control skill competence is questionable.


Radio Frequency Identification (RFID) is a technology that has been deployed successfully for asset tracking within hospitals, aimed at improving the quality of processes. In the context of Australian hospitals, however, adoption of this technology seems sporadic. This research reports on a long-term investigation to gain a deeper understanding of the socio-technical factors involved in the adoption of RFID in Australian hospitals. The research was conducted using an interpretive multiple-case methodology, and the results were analyzed through an Actor-Network Theory (ANT) lens. © 2013 Infonomics Society.


A new portfolio risk measure, the uncertainty of the portfolio's fuzzy return, is introduced in this paper. Going beyond the well-known Sharpe ratio (i.e., the reward-to-variability ratio) of modern portfolio theory, we introduce the so-called fuzzy Sharpe ratio in the fuzzy modeling context. In addition to the new risk measure, we also put forward the reward-to-uncertainty ratio to assess portfolio performance in fuzzy modeling. Corresponding to two approaches based on TM and TW fuzzy arithmetic, two portfolio optimization models are formulated in which the uncertainty of the portfolio's fuzzy returns is minimized while the fuzzy Sharpe ratio is maximized. These models are solved by a fuzzy approach or by a genetic algorithm (GA). Solutions of the two proposed models are shown to dominate, in terms of portfolio return uncertainty, those of the conventional mean-variance optimization (MVO) model used prevalently in the financial literature. In terms of portfolio performance evaluated by the fuzzy Sharpe ratio and the reward-to-uncertainty ratio, the model using TW fuzzy arithmetic yields higher-performance portfolios than those obtained by both the MVO model and the fuzzy model employing TM fuzzy arithmetic. We also find that the fuzzy approach for solving multiobjective problems appears to achieve better solutions than the GA, although the GA can offer a series of well-diversified portfolio solutions arranged along a Pareto frontier.
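For orientation, the crisp Sharpe ratio that the fuzzy version generalizes can be computed as below; the return series is made up, and the fuzzy counterpart replaces the standard deviation with the uncertainty of the portfolio's fuzzy return.

```python
import numpy as np

def sharpe_ratio(returns, risk_free=0.0):
    """Classical (crisp) Sharpe ratio: mean excess return per unit of
    variability (sample standard deviation). The fuzzy Sharpe ratio
    swaps the denominator for a fuzzy-return uncertainty measure."""
    excess = np.asarray(returns, dtype=float) - risk_free
    return excess.mean() / excess.std(ddof=1)

returns = np.array([0.05, 0.02, -0.01, 0.04, 0.03])  # hypothetical period returns
sr = sharpe_ratio(returns, risk_free=0.01)
```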


This brief proposes an efficient technique for the construction of optimized prediction intervals (PIs) by using the bootstrap technique. The method employs an innovative PI-based cost function in the training of neural networks (NNs) used for estimation of the target variance in the bootstrap method. An optimization algorithm is developed for minimization of the cost function and adjustment of NN parameters. The performance of the optimized bootstrap method is examined for seven synthetic and real-world case studies. It is shown that application of the proposed method improves the quality of constructed PIs by more than 28% over the existing technique, leading to narrower PIs with a coverage probability greater than the nominal confidence level.
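The bootstrap construction of PIs can be sketched with a simple linear model standing in for the NNs; the resampling loop and variance decomposition below are a generic textbook version under made-up data, not the PI-based cost function or optimization proposed in the brief.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_prediction_interval(x, y, x_new, n_boot=200, z=1.96):
    """Construct bootstrap prediction intervals around a fitted model.

    Each bootstrap resample refits the model; the spread of the resampled
    predictions estimates model uncertainty, and the residual variance of
    a fit on the full data estimates the noise variance."""
    n = len(x)
    preds = np.empty((n_boot, len(x_new)))
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)          # resample with replacement
        coef = np.polyfit(x[idx], y[idx], deg=1)  # refit on the resample
        preds[b] = np.polyval(coef, x_new)
    mean = preds.mean(axis=0)
    model_var = preds.var(axis=0)                 # model (parameter) uncertainty
    noise_var = np.var(y - np.polyval(np.polyfit(x, y, 1), x))
    half_width = z * np.sqrt(model_var + noise_var)
    return mean - half_width, mean + half_width

x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)   # noisy synthetic data
lower, upper = bootstrap_prediction_interval(x, y, np.array([2.0, 5.0, 8.0]))
```

A PI is judged by exactly the two properties the brief optimizes: coverage probability (how often the true value falls inside) and width (narrower is more informative).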


Background: gait analysis is a recommended geriatric assessment for falls risk and sarcopenia; however, previous research utilises measurements at a single time point only. It is presently unclear how changes in gait over several years influence the risk of recurrent falls in older adults.

Methods: we investigated 135 female volunteers (mean age ± SD: 76.7 ± 5.0 years; range: 70-92 years) at high risk of fracture. Gait parameters (speed, cadence, step length, step width, swing time and double support phase) were assessed using the GAITRite Electronic Walkway System at four annual clinics over approximately 3.7 ± 0.5 years. Participants reported incident falls monthly for 3.7 ± 1.2 years.

Results: increasing gait speed (odds ratio: 0.96; 95% confidence interval 0.93, 0.99) and step length (0.87; 0.77, 0.98) from baseline to final follow-up was associated with a reduced likelihood of being a recurrent faller over the study period. No significant associations were observed for baseline gait parameters (all P ≥ 0.05). At the second follow-up (2.8 ± 0.6 years), an increase in swing time (0.65; 0.43, 0.98) was associated with a reduced likelihood, while an increase in double support phase (1.31; 1.04, 1.66) was associated with an increased likelihood, of being a recurrent faller in the subsequent 1.3 years following this time point.

Conclusion: changes in gait parameters over several years are significantly associated with the likelihood of being a recurrent faller among community-dwelling older women at high risk of fracture. Further research is required to develop gait monitoring guidelines and gait parameter decline cut-points that may be utilised by clinicians to identify older adults at risk of incident falls and sarcopenia.


In many network applications, traffic is bursty in nature. Often, the transient response of the network to such traffic is the result of a series of interdependent events whose occurrence is not trivial to predict. Previous efforts on IEEE 802.15.4 networks often followed top-down approaches to model those sequences of events, i.e., by building top-level models of the whole network, they tried to track the transient response of the network to burst packet arrivals. The problem with such approaches was that they were unable to give station-level views of the network response and were usually complex. In this paper, we propose a non-stationary analytical model of the IEEE 802.15.4 slotted CSMA/CA medium access control (MAC) protocol under a burst traffic arrival assumption and without the optional acknowledgements. We develop a station-level stochastic time-domain method from which network-level metrics are extracted. Our bottom-up approach makes it possible to find station-level details such as delay, collision, and failure distributions. Moreover, network-level metrics such as the average packet loss or transmission success rate can be extracted from the model. Compared to previous models, our model is proven to be of lower memory and computational complexity order, and it also supports contention window sizes greater than one. We have carried out extensive comparative simulations that show the high accuracy of our model.


Online social networks (OSNs) have become one of the major platforms for people to exchange information. Both positive information (e.g., ideas, news, and opinions) and negative information (e.g., rumors and gossip) spreading in social media can greatly influence our lives. Previously, researchers have proposed models to understand their propagation dynamics, but those models were merely simulation-based and focused on the spread of only one type of information. Due to the human-related factors involved, the simultaneous spread of negative and positive information cannot be treated as the superposition of two independent propagations. To address these deficiencies, we propose an analytical model that is built stochastically from the node level up. It can capture the temporal dynamics of the spread, such as when people check newly arrived messages or forward them. Moreover, it captures people's behavioral differences in preferring what to believe or disbelieve. Using this model, we studied the impact of social parameters on propagation. We found that some factors, such as people's preferences and the injection time of the opposing information, are critical to the propagation, while others, such as the hearsay forwarding intention, have little impact on it. Extensive simulations conducted on real topologies confirm the high accuracy of our model.
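The node-level, stochastic flavor of such a model can be illustrated with a toy simulation of competing rumor/truth spread; the state machine, parameters, and preference rule below are illustrative assumptions, far simpler than the analytical model described above.

```python
import random

def spread(adj, rumor_seed, truth_seed, p_forward=0.5, p_prefer_truth=0.7,
           steps=10, rng=random.Random(42)):
    """Toy node-level simulation of competing rumor/truth spread.

    A node hearing both messages in the same step believes the truth with
    probability p_prefer_truth, a stand-in for behavioral preference.
    States: 0 = ignorant, 1 = believes rumor, 2 = believes truth."""
    state = {u: 0 for u in adj}
    state[rumor_seed] = 1
    state[truth_seed] = 2
    for _ in range(steps):
        heard = {u: set() for u in adj}
        for u in adj:                          # believers forward probabilistically
            if state[u] and rng.random() < p_forward:
                for v in adj[u]:
                    heard[v].add(state[u])
        for u, msgs in heard.items():          # only ignorant nodes change state
            if state[u] == 0 and msgs:
                if msgs == {1, 2}:
                    state[u] = 2 if rng.random() < p_prefer_truth else 1
                else:
                    state[u] = msgs.pop()
    return state

# A small ring network with the rumor and the truth injected at opposite sides.
adj = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}
final = spread(adj, rumor_seed=0, truth_seed=4)
```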


Software-Defined Networking (SDN) is a promising network paradigm that separates the control plane and the data plane in the network. It has shown great advantages in simplifying network management, allowing new functions to be easily supported without physical access to the network switches. However, Ternary Content Addressable Memory (TCAM), the critical hardware that stores rules for high-speed packet processing in SDN-enabled devices, can be supplied to each device only in very limited quantity because it is expensive and energy-consuming. To use TCAM resources efficiently, we propose a rule multiplexing scheme, in which the same set of rules deployed on a node applies to the whole flow of a session passing through it, even when the flow is split across different paths. Based on this scheme, we study the rule placement problem with the objective of minimizing rule space occupation for multiple unicast sessions under QoS constraints. We formulate the optimization problem jointly considering routing engineering and rule placement under both the existing and our rule multiplexing schemes. Via an extensive review of the state-of-the-art work, to the best of our knowledge, we are the first to study the non-routing-rule placement problem. Finally, extensive simulations are conducted to show that our proposals significantly outperform existing solutions.
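The saving from rule multiplexing can be illustrated with a toy TCAM occupancy count for one session whose flow is split over two paths; the accounting below is an illustrative simplification, not the paper's optimization formulation.

```python
def rule_space(paths, rules_per_path, multiplexed):
    """Total TCAM occupation for one session split over several paths.

    Without multiplexing, each path installs its own rule set on every
    node it traverses, so a node shared by two paths stores two sets.
    With multiplexing, such a node stores a single common rule set."""
    occupation = {}
    for path in paths:
        for node in path:
            if multiplexed:
                occupation[node] = rules_per_path              # one shared set
            else:
                occupation[node] = occupation.get(node, 0) + rules_per_path
    return sum(occupation.values())

paths = [["s", "a", "b", "t"], ["s", "c", "b", "t"]]  # two paths of one session
default_cost = rule_space(paths, rules_per_path=4, multiplexed=False)
shared_cost = rule_space(paths, rules_per_path=4, multiplexed=True)
```

Nodes s, b, and t lie on both paths, so multiplexing saves exactly one rule set at each of them.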


With the explosion of big data, processing large numbers of continuous data streams, i.e., big data stream processing (BDSP), has become a crucial requirement for many scientific and industrial applications in recent years. By offering a pool of computation, communication, and storage resources, public clouds, like Amazon's EC2, are undoubtedly the most efficient platforms to meet the ever-growing needs of BDSP. Public cloud service providers usually operate a number of geo-distributed datacenters across the globe, and different datacenter pairs have different inter-datacenter network costs charged by Internet Service Providers (ISPs). Meanwhile, inter-datacenter traffic in BDSP constitutes a large portion of a cloud provider's traffic demand over the Internet and incurs substantial communication cost, which may even become the dominant operational expenditure factor. As datacenter resources are provided in a virtualized way, the virtual machines (VMs) for stream processing tasks can be freely deployed onto any datacenter, provided that the Service Level Agreement (SLA, e.g., quality of information) is obeyed. This raises the opportunity, but also a challenge, to exploit the inter-datacenter network cost diversity to optimize both VM placement and load balancing toward network cost minimization with a guaranteed SLA. In this paper, we first propose a general modeling framework that describes all representative inter-task relationship semantics in BDSP. Based on this framework, we then formulate the communication cost minimization problem for BDSP as a mixed-integer linear programming (MILP) problem and prove it to be NP-hard. We then propose a computation-efficient solution based on the MILP. The high efficiency of our proposal is validated by extensive simulation-based studies.
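On a toy instance, the cost-minimizing VM placement can be found by brute force; the task graph, datacenter names, and ISP prices below are hypothetical, and the problem is only tractable this way at tiny scale (hence the MILP formulation for realistic instances, and its NP-hardness).

```python
from itertools import product

def min_cost_placement(tasks, edges, datacenters, link_cost):
    """Brute-force the VM placement minimizing inter-datacenter traffic cost.

    edges     : (task_a, task_b, traffic_volume) data streams between tasks.
    link_cost : dict (dc_a, dc_b) -> unit cost charged by ISPs; traffic
                staying inside one datacenter is free (missing keys -> 0)."""
    best_cost, best = float("inf"), None
    for placement in product(datacenters, repeat=len(tasks)):
        where = dict(zip(tasks, placement))
        cost = sum(vol * link_cost.get((where[a], where[b]), 0)
                   for a, b, vol in edges)
        if cost < best_cost:
            best_cost, best = cost, where
    return best_cost, best

tasks = ["ingest", "filter", "aggregate"]
edges = [("ingest", "filter", 10), ("filter", "aggregate", 5)]
link_cost = {("us", "eu"): 2, ("eu", "us"): 2}   # hypothetical ISP prices
cost, placement = min_cost_placement(tasks, edges, ["us", "eu"], link_cost)
```

With no SLA constraint pulling tasks apart, the optimum here co-locates the whole pipeline in one datacenter, driving the inter-datacenter cost to zero.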


This paper presents a convex geometry (CG)-based method for the blind separation of nonnegative sources. First, the inaccessible source matrix is normalized to be column-sum-to-one by mapping the available observation matrix. Then, its zero-samples are found by searching the facets of the convex hull spanned by the mapped observations. Considering these zero-samples, a quadratic cost function with respect to each row of the unmixing matrix, together with a linear constraint on the involved variables, is proposed. Building on this, an algorithm is presented to estimate the unmixing matrix by solving a classical convex optimization problem. Unlike traditional blind source separation (BSS) methods, the CG-based method requires neither the independence assumption nor the uncorrelation assumption. Compared with BSS methods specifically designed to distinguish between nonnegative sources, the proposed method requires a weaker sparsity condition. Simulation results are provided to illustrate the performance of our method.
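The normalization step can be sketched directly: a standard simplex mapping rescales each column of a nonnegative matrix to sum to one, so every column becomes a point in the probability simplex. The example matrix is made up; the paper applies this mapping to the observations so that the mapped sources inherit the property.

```python
import numpy as np

def column_sum_to_one(X):
    """Map a nonnegative matrix so that every column sums to one.

    Assumes no all-zero column; each column is divided by its own sum,
    placing it on the probability simplex."""
    X = np.asarray(X, dtype=float)
    return X / X.sum(axis=0, keepdims=True)

X = np.array([[1.0, 2.0, 0.0],
              [3.0, 2.0, 5.0]])
Xn = column_sum_to_one(X)
```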