1000 results for Sequential patterns


Relevance: 20.00%

Abstract:

Understanding the nature of the workloads and system demands created by users of the World Wide Web is crucial to properly designing and provisioning Web services. Previous measurements of Web client workloads have been shown to exhibit a number of characteristic features; however, it is not clear how those features may be changing with time. In this study we compare two measurements of Web client workloads separated in time by three years, both captured from the same computing facility at Boston University. The older dataset, obtained in 1995, is well known in the research literature and has been the basis for a wide variety of studies. The newer dataset was captured in 1998 and is comparable in size to the older dataset. The new dataset has the drawback that the collection of users measured may no longer be representative of general Web users; however, using it has the advantage that many comparisons can be drawn more clearly than would be possible using a new, different source of measurement. Our results fall into two categories. First, we compare the statistical and distributional properties of Web requests across the two datasets. This serves to reinforce and deepen our understanding of the characteristic statistical properties of Web client requests. We find that the kinds of distributions that best describe document sizes have not changed between 1995 and 1998, although the specific values of the distributional parameters are different. Second, we explore the question of how the observed differences in the properties of Web client requests, particularly the popularity and temporal locality properties, affect the potential for Web file caching in the network. We find that, for the computing facility represented by our traces between 1995 and 1998, (1) the benefits of using size-based caching policies have diminished; and (2) the potential for caching requested files in the network has declined.
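
Since the comparison above turns on how size-based caching policies interact with popularity and temporal locality, a small simulation makes the trade-off concrete. The sketch below is illustrative only (not the paper's methodology): a byte-capacity LRU cache versus a policy that evicts the largest object first, driven by a synthetic trace with Zipf-like popularity and heavy-tailed file sizes; all parameters are assumptions.

```python
import random
from collections import OrderedDict

def lru_hit_rate(trace, sizes, capacity):
    """Byte-capacity LRU: evict least-recently-used objects until the new one fits."""
    cache, used, hits = OrderedDict(), 0, 0
    for obj in trace:
        if obj in cache:
            cache.move_to_end(obj)
            hits += 1
            continue
        while cache and used + sizes[obj] > capacity:
            _, s = cache.popitem(last=False)   # pop the least recently used
            used -= s
        if sizes[obj] <= capacity:
            cache[obj] = sizes[obj]
            used += sizes[obj]
    return hits / len(trace)

def size_based_hit_rate(trace, sizes, capacity):
    """SIZE-style policy: evict the largest cached object first, favoring many small files."""
    cache, used, hits = {}, 0, 0
    for obj in trace:
        if obj in cache:
            hits += 1
            continue
        while cache and used + sizes[obj] > capacity:
            biggest = max(cache, key=cache.get)
            used -= cache.pop(biggest)
        if sizes[obj] <= capacity:
            cache[obj] = sizes[obj]
            used += sizes[obj]
    return hits / len(trace)

# Synthetic workload: Zipf-like popularity, heavy-tailed (Pareto) file sizes.
random.seed(1)
N = 5000
popularity = [1 / r ** 0.8 for r in range(1, N + 1)]
sizes = [int(random.paretovariate(1.3) * 2048) for _ in range(N)]
trace = random.choices(range(N), weights=popularity, k=50_000)
for policy in (lru_hit_rate, size_based_hit_rate):
    print(policy.__name__, round(policy(trace, sizes, capacity=2_000_000), 3))
```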

Relevance: 20.00%

Abstract:

In this report, we extend our study of the intensity of mistreatment in distributed caching groups due to state interaction. In our earlier work (published as BUCS-TR-2006-003), we showed analytically how this type of mistreatment may appear under homogeneous demand distributions, and we provided a simple setting in which it may occur: one or more "overactive" nodes generate disproportionately more requests than the other nodes. Here, we extend our experimental evaluation of the intensity of mistreatment to which non-overactive nodes are subjected when the demand distributions are not homogeneous.
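
As a heavily simplified illustration of the overactive-node setting (a toy model, not the report's simulator): the nodes below share a single LRU state, node 0 issues ten times the demand of its peers over a disjoint object namespace, and the peers' hit rates can be compared with and without the overactive node.

```python
import random
from collections import OrderedDict

def hit_rates(demand, capacity=500, n_objects=2000, steps=50_000, zipf=0.9, seed=0):
    """Toy shared-LRU group: every node's requests update one LRU state, so a
    node with outsized demand ("overactive") evicts its peers' working sets.
    Nodes request disjoint object sets drawn from the same Zipf popularity law."""
    rng = random.Random(seed)
    node_ids = range(len(demand))
    pop = [1 / r ** zipf for r in range(1, n_objects + 1)]
    nodes = rng.choices(node_ids, weights=demand, k=steps)
    ranks = rng.choices(range(n_objects), weights=pop, k=steps)
    cache = OrderedDict()
    hits = [0] * len(demand)
    reqs = [0] * len(demand)
    for node, rank in zip(nodes, ranks):
        obj = (node, rank)                 # disjoint namespaces per node
        reqs[node] += 1
        if obj in cache:
            cache.move_to_end(obj)
            hits[node] += 1
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least recently used
            cache[obj] = True
    return [round(h / max(r, 1), 3) for h, r in zip(hits, reqs)]

print(hit_rates([10, 1, 1, 1]))   # node 0 overactive: peers' hit rates suffer
print(hit_rates([1, 1, 1, 1]))    # homogeneous demand baseline
```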

Relevance: 20.00%

Abstract:

We present a thorough characterization of the access patterns in blogspace -- a fast-growing constituent of the content available through the Internet -- which comprises a rich interconnected web of blog postings and comments by an increasingly prominent user community that collectively defines what has become known as the blogosphere. Our characterization of over 35 million read, write, and administrative requests spanning a 28-day period is done from three different blogosphere perspectives. The server view characterizes the aggregate access patterns of all users to all blogs; the user view characterizes how individual users interact with blogosphere objects (blogs); the object view characterizes how individual blogs are accessed. Our findings support two important conclusions. First, we show that the nature of interactions between users and objects is fundamentally different in blogspace from that observed in traditional web content. Access to objects in blogspace can be conceived of as part of an interaction between an author and their readership. As we show in our work, such interactions range from one-to-many "broadcast-type" and many-to-one "registration-type" communication between an author and their readers, to multi-way, iterative "parlor-type" dialogues among members of an interest group. This more interactive nature of the blogosphere leads to interesting traffic and communication patterns that differ from those observed in traditional web content. Second, we identify and characterize novel features of the blogosphere workload, and we investigate the similarities and differences between typical web server workloads and blogosphere server workloads. Given the increasing share of blogspace traffic, understanding such differences is important for capacity planning and traffic engineering purposes, for example.
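
The three views are straightforward to compute from a request log. A minimal sketch, assuming a hypothetical log of (timestamp, user_id, blog_id, action) tuples; the field names and action labels are illustrative, not the paper's trace format:

```python
from collections import Counter

def characterize(log):
    """Aggregate one request log into the three blogosphere views described above."""
    server_view = Counter()   # aggregate mix of read/write/administrative requests
    user_view = Counter()     # how many requests each user issued
    object_view = Counter()   # how many requests each blog received
    for timestamp, user_id, blog_id, action in log:
        server_view[action] += 1
        user_view[user_id] += 1
        object_view[blog_id] += 1
    return server_view, user_view, object_view

log = [(0, "alice", "blogA", "read"), (1, "bob", "blogA", "comment"),
       (2, "alice", "blogB", "write")]
print(characterize(log))
```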

Relevance: 20.00%

Abstract:

Auditory signals of speech are speaker-dependent, but representations of language meaning are speaker-independent. Such a transformation enables speech to be understood from different speakers. A neural model is presented that performs speaker normalization to generate a pitch-independent representation of speech sounds, while also preserving information about speaker identity. This speaker-invariant representation is categorized into unitized speech items, which are input to sequential working memories whose distributed patterns can be categorized, or chunked, into syllable and word representations. The proposed model fits into an emerging model of auditory streaming and speech categorization. The auditory streaming and speaker normalization parts of the model both use multiple strip representations and asymmetric competitive circuits, suggesting that these two circuits arose from similar neural designs. The normalized speech items are rapidly categorized and stably remembered by Adaptive Resonance Theory circuits. Simulations use synthesized steady-state vowels from the Peterson and Barney [J. Acoust. Soc. Am. 24, 175-184 (1952)] vowel database and achieve accuracy rates similar to those achieved by human listeners. These results are compared to behavioral data and other speaker normalization models.

Relevance: 20.00%

Abstract:

Grid cells in the dorsal segment of the medial entorhinal cortex (dMEC) show remarkable hexagonal activity patterns, at multiple spatial scales, during spatial navigation. How these hexagonal patterns arise has excited intense interest. It has previously been shown how a self-organizing map can convert firing patterns across entorhinal grid cells into hippocampal place cells that are capable of representing much larger spatial scales. Can grid cell firing fields also arise during navigation through learning within a self-organizing map? A neural model is proposed that converts path integration signals into hexagonal grid cell patterns of multiple scales. This GRID model creates only grid cell patterns with the observed hexagonal structure, predicts how these hexagonal patterns can be learned from experience, and can process biologically plausible neural input and output signals during navigation. These results support a unified computational framework for explaining how entorhinal-hippocampal interactions support spatial navigation.
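
For readers unfamiliar with self-organizing maps, the sketch below shows the generic competitive-learning step such models build on: the best-matching unit moves toward the current input, so units gradually become selective for recurring input patterns. This is the textbook rule, not the GRID model itself, and the random inputs merely stand in for path-integration signals.

```python
import numpy as np

def som_step(weights, x, lr=0.1):
    """Generic self-organizing-map update: find the best-matching unit and move
    its weight vector toward the input. (A full SOM also updates neighbors.)"""
    winner = np.argmin(((weights - x) ** 2).sum(axis=1))
    weights[winner] += lr * (x - weights[winner])
    return winner

rng = np.random.default_rng(0)
weights = rng.random((10, 2))       # 10 units over a 2-D input space
for x in rng.random((1000, 2)):     # stand-in for path-integration inputs
    som_step(weights, x)
```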

Relevance: 20.00%

Abstract:

This paper presents a self-organizing, real-time, hierarchical neural network model of sequential processing, and shows how it can be used to induce recognition codes corresponding to word categories and elementary grammatical structures. The model, first introduced in Mannes (1992), learns to recognize, store, and recall sequences of unitized patterns in a stable manner, either using short-term memory alone or using long-term memory weights. Memory capacity is limited only by the number of nodes provided. Sequences are mapped to unitized patterns, making the model suitable for hierarchical operation. By using multiple modules arranged in a hierarchy and a simple mapping between the output of lower levels and the input of higher levels, the induction of codes representing word category and simple phrase structures is an emergent property of the model. Simulation results are reported to illustrate this behavior.
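
A toy illustration of how a sequence can be held as a single unitized pattern (my sketch, not the model's equations): items enter a short-term memory whose activations decay over time, so relative amplitudes across nodes encode serial order, and the resulting vector can be passed upward as one pattern.

```python
import numpy as np

def store_sequence(items, n_nodes, decay=0.7):
    """Short-term-memory gradient: each arriving item is set to full strength
    while earlier items decay, so relative amplitudes encode serial order.
    The resulting vector is a unitized pattern a higher module can categorize."""
    stm = np.zeros(n_nodes)
    for item in items:
        stm *= decay          # earlier items fade
        stm[item] = 1.0       # newest item is strongest (recency gradient)
    return stm

print(store_sequence([2, 0, 3], n_nodes=5))  # order recoverable from amplitudes
```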

Relevance: 20.00%

Abstract:

The Fuzzy ART system introduced herein incorporates computations from fuzzy set theory into ART 1. For example, the intersection operator (∩) used in ART 1 learning is replaced by the MIN operator (∧) of fuzzy set theory. Fuzzy ART reduces to ART 1 in response to binary input vectors, but can also learn stable categories in response to analog input vectors. In particular, the MIN operator reduces to the intersection operator in the binary case. Learning is stable because all adaptive weights can only decrease in time. A preprocessing step, called complement coding, uses on-cell and off-cell responses to prevent category proliferation. Complement coding normalizes input vectors while preserving the amplitudes of individual feature activations.
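
The two ingredients named above are easy to state in code. A minimal sketch, assuming feature values already scaled to [0, 1]:

```python
import numpy as np

def complement_code(a):
    """Complement coding: I = (a, 1 - a). The L1 norm of I is constant (the
    dimension of a), so inputs are normalized without losing feature amplitudes."""
    a = np.asarray(a, dtype=float)
    return np.concatenate([a, 1.0 - a])

def fuzzy_and(x, w):
    """Fuzzy MIN operator: component-wise minimum; for binary vectors this
    reduces to set intersection, recovering the ART 1 learning rule."""
    return np.minimum(x, w)

I = complement_code([0.2, 0.9])
print(I, I.sum())   # the sum is always 2.0 for two-dimensional inputs
```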

Relevance: 20.00%

Abstract:

A Fuzzy ART model capable of rapid stable learning of recognition categories in response to arbitrary sequences of analog or binary input patterns is described. Fuzzy ART incorporates computations from fuzzy set theory into the ART 1 neural network, which learns to categorize only binary input patterns. The generalization to learning both analog and binary input patterns is achieved by replacing appearances of the intersection operator (∩) in ART 1 by the MIN operator (∧) of fuzzy set theory. The MIN operator reduces to the intersection operator in the binary case. Category proliferation is prevented by normalizing input vectors at a preprocessing stage. A normalization procedure called complement coding leads to a symmetric theory in which the MIN operator (∧) and the MAX operator (∨) of fuzzy set theory play complementary roles. Complement coding uses on-cells and off-cells to represent the input pattern, and preserves individual feature amplitudes while normalizing the total on-cell/off-cell vector. Learning is stable because all adaptive weights can only decrease in time. Decreasing weights correspond to increasing sizes of category "boxes". Smaller vigilance values lead to larger category boxes. Learning stops when the input space is covered by boxes. With fast learning and a finite input set of arbitrary size and composition, learning stabilizes after just one presentation of each input pattern. A fast-commit slow-recode option combines fast learning with a forgetting rule that buffers system memory against noise. Using this option, rare events can be rapidly learned, yet previously learned memories are not rapidly erased in response to statistically unreliable input fluctuations.
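
Putting the pieces of this abstract together, here is a compact sketch of one Fuzzy ART presentation (category choice, vigilance test, and weight update) in the standard notation: choice parameter alpha, learning rate beta, vigilance rho. Treat it as an illustration rather than a reference implementation.

```python
import numpy as np

def fuzzy_art_present(I, W, rho=0.75, alpha=0.001, beta=1.0):
    """One presentation of a complement-coded input I. W is a list of category
    weight vectors. beta=1 is fast learning; using beta<1 for already-committed
    categories gives the fast-commit slow-recode option."""
    # Category choice: T_j = |I ∧ w_j| / (alpha + |w_j|), largest first.
    order = sorted(range(len(W)),
                   key=lambda j: -np.minimum(I, W[j]).sum() / (alpha + W[j].sum()))
    for j in order:
        if np.minimum(I, W[j]).sum() / I.sum() >= rho:   # vigilance test
            # Weights can only decrease, so category "boxes" can only grow.
            W[j] = beta * np.minimum(I, W[j]) + (1 - beta) * W[j]
            return j
    W.append(I.copy())        # no category passes vigilance: commit a new one
    return len(W) - 1

W = []
for a in ([0.2, 0.9], [0.25, 0.85], [0.9, 0.1]):
    a = np.array(a)
    I = np.concatenate([a, 1 - a])    # complement coding
    print(fuzzy_art_present(I, W))    # the first two inputs share a category
```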

Relevance: 20.00%

Abstract:

The study is a cross-linguistic, cross-sectional investigation of the impact of learning contexts on the acquisition of sociopragmatic variation patterns and the subsequent enactment of compound identities. The informants are 20 non-native speaker teachers of English from a range of 10 European countries. They are all primarily mono-contextual foreign language learners/users of English; however, they differ with respect to the length of time accumulated in a target language environment. This allows for three groups to be established: those who have accumulated 60 days or less; those with between 90 days and one year; and the final group, all of whom have accumulated in excess of one year. In order to foster the dismantling of the monolith of learning context, both learning contexts under consideration – i.e. the foreign language context and the submersion context – are broken down into micro-contexts which I refer to as loci of learning. For the purpose of this study, two loci are considered: the institutional and the conversational locus. In order to correlate the impact of learning contexts and loci of learning with the acquisition of sociopragmatic variation patterns, a two-fold study is conducted. The first stage is the completion of a highly detailed language contact profile (LCP) questionnaire. This provides extensive biographical information regarding language learning history and is a powerful tool in illuminating the intensity of contact with the L2 that learners experience in both contexts, as well as shedding light on the loci of learning to which learners are exposed in both contexts. Following the completion of the LCP, the informants take part in two role plays which require the enactment of differential identities when engaged in a speech event of asking for advice. The enactment of identities then undergoes a strategic and linguistic analysis in order to investigate if and how differences in the enactment of compound identities are indexed in language. Results indicate that learning context has a considerable impact not only on how identity is indexed in language, but also on the nature of the identities enacted. Informants with very low levels of cross-contextuality index identity through strategic means, i.e. levels of directness and conventionality; however, greater degrees of cross-contextuality give rise to the indexing of differential identities linguistically, by means of speaker/hearer orientation and (non-)solidary moves. When it comes to the nature of the identity enacted, it seems that more time spent in intense contact with native speakers in a range of loci of learning allows learners to enact their core identity, whereas low levels of contact with over-exposure to the institutional locus of learning fosters the enactment of generic identities.

Relevance: 20.00%

Abstract:

Distribution of soft sediment benthic fauna and the environmental factors affecting them were studied to investigate changes across spatial and temporal scales. Investigations took place at Lough Hyne Marine Reserve using a range of methods. Data on the sedimentation rates of organic and inorganic matter were collected at monthly intervals for one year at a number of sites around the Lough, by use of vertical midwater-column sediment traps. Sedimentation of these two fractions was not coupled; inorganic matter sedimentation depended on hydrodynamic and weather factors, while organic matter sedimentation was more complex, being dependent on biological and chemical processes in the water column. The effects of regular hypoxic episodes on benthic fauna due to a natural seasonal thermocline were studied in the deep Western Trough, using a camera-equipped remotely operated vehicle to follow transects at three-monthly intervals over one year. In late summer, the area below the thermocline of the Western Trough was devoid of visible fauna. Decapod crustaceans were the first taxon to make use of ameliorating oxygen conditions in autumn, darting below the thermocline depth, most likely to scavenge; this was indicated by the tracks they left on the surface of the Trough floor. Some species, most noticeably Fries' goby Lesueurigobius friesii, migrated below the thermocline depth when conditions were normoxic and established semi-permanent burrows. Their population encompassed all size classes, indicating that this habitat was not limited to juveniles of this territorial species. Recolonisation by macrofauna and burrowing megafauna was studied during normoxic conditions, from November 2009 to May 2010. Macrofauna displayed a typical post-disturbance pattern of recolonisation, with one species, the polychaete Scalibregma inflatum, occurring at high abundance levels in March 2010. In May, this population had become significantly reduced and a more diverse community was established. The abundance of burrowing infauna, comprising decapod crabs and Fries' gobies, was estimated by identifying and counting their distinctive burrow structures. While burrow abundance above the summer thermocline depth increased in a linear fashion, below the thermocline depth a slight reduction occurred in May, when oxygen conditions deteriorated again. The majority of the burrows occurring in May were made by Fries' gobies, which are thought to encounter low oxygen concentrations in their burrows. The reduction in burrow abundance of the burrowing shrimps Calocaris macandreae and Callianassa subterranea (based on descriptions of burrow structures from the literature) from March to May might be related to their reduced activity in hypoxia, leading to loss of structural burrow maintenance. Spatial and temporal changes to macrofaunal assemblage structures were studied seasonally for one year across five sites in the Lough and subjected to multivariate statistical analysis. Assemblage structures were significantly correlated with organic matter levels in the sediment, the amount of organic matter settling out of the water column one month before macrofaunal sampling took place, as well as current speed and temperature. This study was the first to investigate patterns and processes in the Lough's soft sediment ecology across all three basins on temporal and spatial scales.
An investigation into the oceanographic aspects of the development, behaviour and breakdown of the summer thermocline of Lough Hyne was performed in collaboration with researchers from other Irish institutions.

Relevance: 20.00%

Abstract:

Contemporary IT standards are designed, not selected. Their design enacts a complex process that brings together a coalition of players. We examine the design of the SOAP standard to discover activity patterns in this design process. The paper reports these patterns as a precursor to developing a micro-level process theory for designing IT standards.

Relevance: 20.00%

Abstract:

BACKGROUND AND PURPOSE: Docetaxel is an active agent in the treatment of metastatic breast cancer. We evaluated the feasibility of docetaxel-based sequential and combination regimens as adjuvant therapies for patients with node-positive breast cancer. PATIENTS AND METHODS: Three consecutive groups of patients with node-positive breast cancer or locally advanced disease, aged ≤70 years, received one of the following regimens: a) sequential A→T→CMF: doxorubicin 75 mg/m2 q 3 weeks x 3, followed by docetaxel 100 mg/m2 q 3 weeks x 3, followed by i.v. CMF days 1 + 8 q 4 weeks x 3; b) sequential accelerated A→T→CMF: A and T were administered at the same doses q 2 weeks; c) combination therapy: doxorubicin 50 mg/m2 + docetaxel 75 mg/m2 q 3 weeks x 4, followed by CMF x 4. When indicated, radiotherapy was administered during or after CMF, and tamoxifen was started after the end of CMF. RESULTS: Seventy-nine patients have been treated. Median age was 48 years. A 30% rate of early treatment discontinuation was observed in patients receiving the sequential accelerated therapy (23% during A→T), due principally to severe skin toxicity. Median relative dose-intensity was 100% in the three treatment arms. The incidence of G3-G4 major toxicities among treated patients was as follows: skin toxicity a: 5%; b: 27%; c: 0%; stomatitis a: 20%; b: 20%; c: 3%. The incidence of neutropenic fever was a: 30%; b: 13%; c: 48%. After a median follow-up of 18 months, no late toxicity has been reported. CONCLUSIONS: The accelerated sequential A→T→CMF treatment is not feasible due to an excess of skin toxicity. The sequential non-accelerated and the combination regimens are feasible and under evaluation in a phase III trial of adjuvant therapy.

Relevance: 20.00%

Abstract:

BACKGROUND: Docetaxel has proven efficacy in metastatic breast cancer. In this pilot study, we explored the efficacy/feasibility of docetaxel-based sequential and combination regimens as adjuvant therapy of node-positive breast cancer. PATIENTS AND METHODS: From March 1996 to March 1998, four consecutive groups of patients with stages II and III breast cancer, aged ≤70 years, received one of the following regimens: a) sequential doxorubicin (A) → docetaxel (T) → CMF (cyclophosphamide + methotrexate + 5-fluorouracil): A 75 mg/m2 q 3 wks x 3, followed by T 100 mg/m2 q 3 wks x 3, followed by i.v. CMF days 1 + 8 q 4 wks x 3; b) sequential accelerated A → T → CMF: A and T administered at the same doses q 2 wks with lenograstim support; c) combination therapy: A 50 mg/m2 + T 75 mg/m2 q 3 wks x 4, followed by CMF x 4; d) sequential T → A → CMF: T and A administered as in group a), with the reverse sequence. When indicated, radiotherapy was administered during or after CMF, and tamoxifen after CMF. RESULTS: Ninety-three patients were treated. The median age was 48 years (29-66) and the median number of positive axillary nodes was 6 (1-25). Tumors were operable in 94% and locally advanced in 6% of cases. Pathological tumor size was >2 cm in 72% of cases. There were 21 relapses (18 systemic, 3 locoregional), and 11 patients (12%) have died from disease progression. At a median follow-up of 39 months (6-57), overall survival (OS) was 87% (95% CI, 79-94%) and disease-free survival (DFS) was 76% (95% CI, 67%-85%). CONCLUSION: The efficacy of these docetaxel-based regimens, in terms of OS and DFS, appears to be at least as good as standard anthracycline-based adjuvant chemotherapy (CT) in similar high-risk patient populations.

Relevance: 20.00%

Abstract:

This paper describes a methodology for detecting anomalies from sequentially observed and potentially noisy data. The proposed approach consists of two main elements: 1) filtering, or assigning a belief or likelihood to each successive measurement based upon our ability to predict it from previous noisy observations and 2) hedging, or flagging potential anomalies by comparing the current belief against a time-varying and data-adaptive threshold. The threshold is adjusted based on the available feedback from an end user. Our algorithms, which combine universal prediction with recent work on online convex programming, do not require computing posterior distributions given all current observations and involve simple primal-dual parameter updates. At the heart of the proposed approach lie exponential-family models which can be used in a wide variety of contexts and applications, and which yield methods that achieve sublinear per-round regret against both static and slowly varying product distributions with marginals drawn from the same exponential family. Moreover, the regret against static distributions coincides with the minimax value of the corresponding online strongly convex game. We also prove bounds on the number of mistakes made during the hedging step relative to the best offline choice of the threshold with access to all estimated beliefs and feedback signals. We validate the theory on synthetic data drawn from a time-varying distribution over binary vectors of high dimensionality, as well as on the Enron email dataset.
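
To make the filter-then-hedge loop concrete, here is a toy sketch assuming Bernoulli (exponential-family) marginals, a surprisal-based belief, and an online subgradient step on the threshold. It illustrates the division of labor described above, not the paper's exact algorithm; all names and constants are assumptions.

```python
import numpy as np

def anomaly_stream(X, feedback, eta_p=0.05, eta_tau=0.01, tau0=0.5):
    """Filtering + hedging on a stream of binary vectors X[t].
    Belief = per-round average surprisal under running Bernoulli marginal
    estimates; the threshold tau is adapted by online steps on user feedback
    (y = +1 true anomaly, -1 normal, 0 no feedback)."""
    d = X.shape[1]
    p = np.full(d, 0.5)        # running Bernoulli parameter estimates
    tau = tau0
    flags = []
    eps = 1e-6
    for x, y in zip(X, feedback):
        # Filtering: negative log-likelihood of x under the current model.
        nll = -(x * np.log(p + eps) + (1 - x) * np.log(1 - p + eps)).sum() / d
        flag = nll > tau
        flags.append(flag)
        # Hedging: nudge the threshold to reduce mistakes, given feedback.
        if y == +1 and not flag:
            tau -= eta_tau     # missed anomaly: lower the bar
        elif y == -1 and flag:
            tau += eta_tau     # false alarm: raise the bar
        # Update the exponential-family (Bernoulli) estimates toward x.
        p = (1 - eta_p) * p + eta_p * x
    return np.array(flags)

rng = np.random.default_rng(0)
X = (rng.random((500, 20)) < 0.3).astype(float)
X[250:] = (rng.random((250, 20)) < 0.7)    # distribution shift halfway through
fb = np.zeros(500); fb[250:260] = +1       # sparse user feedback on the shift
print(anomaly_stream(X, fb)[245:265])
```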

Relevance: 20.00%

Abstract:

A popular way to account for unobserved heterogeneity is to assume that the data are drawn from a finite mixture distribution. A barrier to using finite mixture models is that parameters that could previously be estimated in stages must now be estimated jointly: using mixture distributions destroys any additive separability of the log-likelihood function. We show, however, that an extension of the EM algorithm reintroduces additive separability, thus allowing one to estimate parameters sequentially during each maximization step. In establishing this result, we develop a broad class of estimators for mixture models. Returning to the likelihood problem, we show that, relative to full information maximum likelihood, our sequential estimator can generate large computational savings with little loss of efficiency.
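
To see how separability returns, consider a two-component Gaussian mixture: given the E-step responsibilities, the expected complete-data log-likelihood is a sum over components, so each component's parameters can be maximized on their own, one block after another. A minimal sketch (illustrative; the paper's estimator class is far more general):

```python
import numpy as np

def em_mixture(x, iters=200):
    """EM for a 2-component Gaussian mixture in which the weighted M-step is
    additively separable: each component's (mu, sigma) is maximized on its
    own, sequentially, given the E-step responsibilities."""
    rng = np.random.default_rng(0)
    mu = rng.choice(x, 2)
    sigma = np.array([x.std(), x.std()])
    pi = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibilities r[i, k] (normalizing constants cancel).
        dens = np.stack([pi[k] * np.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                         / sigma[k] for k in range(2)], axis=1)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step, one component at a time: the weighted log-likelihood is a
        # sum over components, so each block maximization is independent.
        for k in range(2):
            w = r[:, k]
            mu[k] = (w * x).sum() / w.sum()
            sigma[k] = np.sqrt((w * (x - mu[k]) ** 2).sum() / w.sum())
        pi = r.mean(axis=0)
    return pi, mu, sigma

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_mixture(x))
```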