828 results for Innovations in web technology
Abstract:
Two silicon light-emitting devices with different structures are realized in a standard 0.35 µm complementary metal-oxide-semiconductor (CMOS) technology. They operate in reverse breakdown mode and can be turned on at 8.3 V. Output optical powers of 13.6 nW and 12.1 nW are measured at 10 V and 100 mA, respectively, and both calculated light-emission intensities exceed 1 mW/cm². The optical spectra of the two devices span 600-790 nm, with a clear peak near 760 nm.
Abstract:
This paper proposes a novel loadless 4T SRAM cell composed of nMOS transistors. The SRAM cell is based on a 32 nm silicon-on-insulator (SOI) technology node. It consists of two access transistors and two pull-down transistors. The pull-down transistors have a larger channel length than the access transistors. Due to the significant short-channel effect of small-size MOS transistors, the access transistors have a much larger leakage current than the pull-down transistors, enabling the SRAM cell to maintain logic "1" while in standby. The storage-node voltages of the cell are fed back to the back-gates of the access transistors, enabling the stable "read" operation of the cell. The use of back-gate feedback also helps to improve the static noise margin (SNM) of the cell. The proposed SRAM cell has a smaller area than conventional bulk 6T SRAM cells and 4T SRAM cells. The speed and power dissipation of the SRAM cell are simulated and discussed. The SRAM cell can operate with a 0.5 V supply voltage.
Abstract:
Understanding the nature of the workloads and system demands created by users of the World Wide Web is crucial to properly designing and provisioning Web services. Previous measurements of Web client workloads have been shown to exhibit a number of characteristic features; however, it is not clear how those features may be changing with time. In this study we compare two measurements of Web client workloads separated in time by three years, both captured from the same computing facility at Boston University. The older dataset, obtained in 1995, is well known in the research literature and has been the basis for a wide variety of studies. The newer dataset was captured in 1998 and is comparable in size to the older dataset. The new dataset has the drawback that the collection of users measured may no longer be representative of general Web users; however, using it has the advantage that many comparisons can be drawn more clearly than would be possible using a new, different source of measurement. Our results fall into two categories. First, we compare the statistical and distributional properties of Web requests across the two datasets. This serves to reinforce and deepen our understanding of the characteristic statistical properties of Web client requests. We find that the kinds of distributions that best describe document sizes have not changed between 1995 and 1998, although specific values of the distributional parameters are different. Second, we explore the question of how the observed differences in the properties of Web client requests, particularly the popularity and temporal locality properties, affect the potential for Web file caching in the network. We find that for the computing facility represented by our traces between 1995 and 1998, (1) the benefits of using size-based caching policies have diminished; and (2) the potential for caching requested files in the network has declined.
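The comparison described here, with the distribution family unchanged but parameter values shifted, amounts to fitting the same model to both datasets and comparing the fitted parameters. As a minimal sketch (on synthetic data, not the Boston University traces; the values 1.1 and 1.3 are illustrative assumptions), a Hill-style maximum-likelihood fit of a Pareto tail index shows how the family can stay fixed while the parameter moves:

```python
import math
import random

def pareto_tail_index(sizes, xmin):
    """Maximum-likelihood (Hill) estimate of the Pareto tail exponent
    alpha for samples at or above the cutoff xmin."""
    tail = [s for s in sizes if s >= xmin]
    return len(tail) / sum(math.log(s / xmin) for s in tail)

# Synthetic stand-ins for two document-size samples drawn from the
# same family (Pareto) with different tail exponents.
random.seed(0)
trace_a = [1000 * random.paretovariate(1.1) for _ in range(50_000)]
trace_b = [1000 * random.paretovariate(1.3) for _ in range(50_000)]

alpha_a = pareto_tail_index(trace_a, xmin=1000)
alpha_b = pareto_tail_index(trace_b, xmin=1000)
print(f"estimated tail index, sample A: {alpha_a:.2f}")
print(f"estimated tail index, sample B: {alpha_b:.2f}")
```

With 50,000 samples each, the two estimates recover the generating exponents closely, so a real shift in the parameter (as the abstract reports between 1995 and 1998) is readily detectable even when the distributional family is identical.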
Abstract:
Under high loads, a Web server may be servicing many hundreds of connections concurrently. In traditional Web servers, the question of the order in which concurrent connections are serviced has been left to the operating system. In this paper we ask whether servers might provide better service by using non-traditional service ordering. In particular, for the case when a Web server is serving static files, we examine the costs and benefits of a policy that gives preferential service to short connections. We start by assessing the scheduling behavior of a commonly used server (Apache running on Linux) with respect to connection size and show that it does not appear to provide preferential service to short connections. We then examine the potential performance improvements of a policy that does favor short connections (shortest-connection-first). We show that mean response time can be improved by factors of four or five under shortest-connection-first, as compared to an (Apache-like) size-independent policy. Finally we assess the costs of shortest-connection-first scheduling in terms of unfairness (i.e., the degree to which long connections suffer). We show that under shortest-connection-first scheduling, long connections pay very little penalty. This surprising result can be understood as a consequence of heavy-tailed Web server workloads, in which most connections are small, but most server load is due to the few large connections. We support this explanation using analysis.
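The intuition behind the "little penalty" result can be seen in a toy batch model (this is an illustration, not the paper's trace-driven queueing simulation): with heavy-tailed sizes, most jobs are small, so serving a batch shortest-first sharply reduces mean response time, while the few large jobs would have waited behind substantial work under any ordering.

```python
import random

def mean_response_time(sizes):
    """Mean completion time when jobs are served back-to-back,
    in the given order, all present at time 0."""
    clock, total = 0.0, 0.0
    for s in sizes:
        clock += s
        total += clock
    return total / len(sizes)

# Heavy-tailed (Pareto) connection sizes: most are small, but most
# of the server load comes from a few large ones.
random.seed(1)
sizes = [random.paretovariate(1.2) for _ in range(10_000)]

fifo = mean_response_time(sizes)          # size-independent order
scf = mean_response_time(sorted(sizes))   # shortest-connection-first
print(f"mean response time, size-blind order: {fifo:.1f}")
print(f"mean response time, shortest-first  : {scf:.1f}")
print(f"improvement factor: {fifo / scf:.1f}x")
```

The improvement factor here is of the same flavor as the four-to-fivefold gains the abstract reports; the exact figure depends on the tail exponent and, in the real system, on the arrival process that this batch sketch deliberately omits.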
Abstract:
Temporal locality of reference in Web request streams emerges from two distinct phenomena: the popularity of Web objects and the temporal correlation of requests. Capturing these two elements of temporal locality is important because it enables cache replacement policies to adjust how they capitalize on temporal locality based on the relative prevalence of these phenomena. In this paper, we show that temporal locality metrics proposed in the literature are unable to delineate between these two sources of temporal locality. In particular, we show that the commonly used distribution of reference interarrival times is predominantly determined by the power law governing the popularity of documents in a request stream. To capture (and more importantly quantify) both sources of temporal locality in a request stream, we propose a new and robust metric that enables accurate delineation between locality due to popularity and that due to temporal correlation. Using this metric, we characterize the locality of reference in a number of representative proxy cache traces. Our findings show that there are measurable differences between the degrees (and sources) of temporal locality across these traces, and that these differences are effectively captured using our proposed metric. We illustrate the significance of our findings by summarizing the performance of a novel Web cache replacement policy, called GreedyDual*, which exploits both long-term popularity and short-term temporal correlation in an adaptive fashion. Our trace-driven simulation experiments (which are detailed in an accompanying Technical Report) show the superior performance of GreedyDual* when compared to other Web cache replacement policies.
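One simple way to see the two sources in isolation (this is a common shuffle-baseline device, not the metric the paper proposes) is to compare a trace against a random permutation of itself: shuffling preserves object popularity but destroys short-term temporal correlation, so any locality remaining in the shuffled stream is due to popularity alone. All parameters below (1,000 objects, Zipf weights, a 30% immediate-repeat probability) are illustrative assumptions.

```python
import random
from collections import OrderedDict

def lru_hit_ratio(stream, capacity):
    """Hit ratio of an LRU cache of the given capacity over a stream."""
    cache, hits = OrderedDict(), 0
    for obj in stream:
        if obj in cache:
            hits += 1
            cache.move_to_end(obj)   # mark most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)   # evict least recently used
            cache[obj] = True
    return hits / len(stream)

# Synthetic trace: Zipf-like popularity plus short-term correlation.
random.seed(2)
weights = [1 / (r + 1) for r in range(1000)]
stream = []
for o in random.choices(range(1000), weights=weights, k=20_000):
    stream.append(o)
    if random.random() < 0.3:   # temporal correlation: near-term repeat
        stream.append(o)

shuffled = stream[:]
random.shuffle(shuffled)        # keeps popularity, removes correlation

print(f"LRU(50) hit ratio, original trace: {lru_hit_ratio(stream, 50):.3f}")
print(f"LRU(50) hit ratio, shuffled trace: {lru_hit_ratio(shuffled, 50):.3f}")
```

The gap between the two hit ratios is attributable to temporal correlation; the shuffled baseline quantifies the popularity contribution, which is the delineation the paper's metric formalizes rigorously.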
Abstract:
The relative importance of long-term popularity and short-term temporal correlation of references for Web cache replacement policies has not been studied thoroughly. This is partially due to the lack of an accurate characterization of temporal locality that enables the identification of the relative strengths of these two sources of temporal locality in a reference stream. In [21], we have proposed such a metric and have shown that Web reference streams differ significantly in the prevalence of these two sources of temporal locality. These findings underscore the importance of a Web caching strategy that can adapt in a dynamic fashion to the prevalence of these two sources of temporal locality. In this paper, we propose a novel cache replacement algorithm, GreedyDual*, which is a generalization of GreedyDual-Size. GreedyDual* uses the metrics proposed in [21] to adjust the relative worth of long-term popularity versus short-term temporal correlation of references. Our trace-driven simulation experiments show the superior performance of GreedyDual* when compared to other Web cache replacement policies proposed in the literature.
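For background, here is a minimal sketch of GreedyDual-Size, the baseline policy that GreedyDual* generalizes (the adaptive weighting driven by the metric of [21] is omitted). Each object's credit is set to L + cost/size on insertion or hit; eviction removes the lowest-credit object and raises the floor L to its credit, which ages older objects without touching every entry. The sketch assumes unit retrieval cost and that no single object exceeds the cache capacity.

```python
import heapq

class GreedyDualSize:
    """Minimal GreedyDual-Size cache with lazy (stale-entry) heap deletion."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.used = 0
        self.L = 0.0          # inflation floor, raised on each eviction
        self.credit = {}      # object -> current credit value
        self.size = {}        # object -> size
        self.heap = []        # (credit, object); may contain stale entries

    def access(self, obj, size, cost=1.0):
        """Process one reference; returns True on a cache hit."""
        hit = obj in self.credit
        if not hit:
            # Evict lowest-credit objects until the new one fits.
            while self.used + size > self.capacity and self.credit:
                c, victim = heapq.heappop(self.heap)
                if self.credit.get(victim) == c:   # skip stale entries
                    self.L = c                     # raise the floor
                    self.used -= self.size.pop(victim)
                    del self.credit[victim]
            self.size[obj] = size
            self.used += size
        self.credit[obj] = self.L + cost / size    # refresh credit
        heapq.heappush(self.heap, (self.credit[obj], obj))
        return hit

# Illustrative trace of (object, size) pairs against a 100-unit cache.
cache = GreedyDualSize(capacity=100)
trace = [("a", 10), ("b", 60), ("a", 10), ("c", 50), ("a", 10)]
hits = sum(cache.access(o, s) for o, s in trace)
print(f"hits: {hits}")   # "b" is evicted to admit "c": big and unpopular
```

The cost/size term is what makes the policy size-aware (small objects are cheaper to keep per byte); GreedyDual* additionally reweights credits to favor popularity or recency depending on which source of locality dominates the trace.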
Abstract:
French chamber music in the last quarter of the nineteenth century displayed significant musical innovation and technical development. As the Parisian public began to favor instrumental music and mélodie over opera, vocal and chamber music with piano became one of the main genres through which French composers expressed their creativity and individuality. Franck, Debussy, Fauré, Duparc, Ravel, Chausson and Poulenc were the major contributors to this unusually creative period in French music. French mélodies of this period blend precision with lyricism, and demand elegance and wit from the performer. They show careful settings of the French language's rhythmic subtleties, as well as increased expressiveness in, and importance of, the piano accompaniment. The chamber works of this period demanded superior pianistic and instrumental virtuosity while displaying wide ranges of sonority, multiple tone colors, and rhythmic fluidity. The three recitals which comprise this dissertation project were performed at the University of Maryland Gildenhorn Recital Hall on 27 October 2006, All Nations Mission Church (Dayton, NJ) on 5 December 2009, and the Leah M. Smith Lecture Hall of the University of Maryland on 11 May 2010.
The repertoire included Poulenc’s Sonata for Oboe and Piano (1962) with oboist Yeongsu Kim, French mélodies by Fauré, Chausson, Debussy, Ravel and Duparc with soprano Jung-A Lee and baritone Hyun-Oh Shin, Poulenc’s Sextet for Piano, Flute, Oboe, Clarinet, Bassoon and Horn (1932-1939) with flutist Katrina Smith, clarinetist Jihoon Chang, bassoonist Erich Heckscher, hornist Heidi Littman and oboist Yeongsu Kim, Debussy’s Sonata for Cello and Piano (1915) with cellist Ji-Sook Shin, Poulenc’s Sonata for Violin and Piano (1942-1949) with violinist Ji-Hee Lim, Franck’s Sonata for Violin and Piano (1886) with violinist Na-Young Cho, Ravel’s Piano Trio (1915) with cellist Ji-Sook Shin and violinist Yu-Jeong Lee and Ravel’s Sonata for Violin and Piano (1927) with violinist Yu-Jeong Lee. The recitals were recorded on compact discs and are archived within the Digital Repository at the University of Maryland (DRUM).
Abstract:
Purpose – To present key challenges associated with the evolution of system-in-package (SiP) technologies, together with technical work in reliability modelling and embedded test that addresses those challenges. Design/methodology/approach – Key challenges have been identified from the electronics and integrated-MEMS industrial sectors. Solutions for optimising the reliability of a typical assembly process and reducing the cost of production test have been studied through simulation and modelling based on technology data released by NXP and in collaboration with the EDA tool vendors Coventor and Flomerics. Findings – Characterised models deliver spatially and material-dependent reliability data that can be used to optimise the robustness of SiP assemblies, together with results that indicate the relative contributions of various structural variables; an initial analytical model for solder-ball reliability; and a solution for embedding a low-cost test for a capacitive RF-MEMS switch, identified as an SiP component presenting a key test challenge. Research limitations/implications – Results will contribute to the further development of NXP wafer-level system-in-package technology. Limitations are the lack, so far, of feedback on the implementation of the recommendations and of physical characterisation of the embedded test solution. Originality/value – Both the methodology and the associated studies on the structural reliability of an industrial SiP technology are unique. The analytical model for solder-ball life is new, as is the embedded test solution for the RF-MEMS switch.
Abstract:
The sense of place that relates human beings to their environment is under threat from the rising tide of placelessness which can result from potentially positive forces such as urban regeneration as well as negative ones such as incremental degradation. The concept of sense of place, and the need to protect and enhance special places, has underpinned UK conservation legislation and policy in the post-war era. In Northern Ireland, due to its distinctive settlement tradition, its troubled political circumstances and its centralised administrative system, a unique hierarchy of special places has evolved, involving areas of townscape and village character as well as conventional conservation areas. For the first time a comprehensive comparative survey of the townscape quality of most of these areas has been carried out in order to test the hypothesis that too many conservation area designations may devalue the conservation coinage. It also assesses the contribution that areas of townscape character can make in this situation, as potential conservation areas or as second-level local amenity designations. Its findings support the initial hypothesis: assessment of townscape quality on the basis of consistent criteria demonstrates a decline in the quality of more recent conservation area designations, and hence some devaluation of the coinage. However, the need for local discretion in the protection of local amenity supports the concept of areas of townscape and village character as an additional and distinct designation. This contradicts recent policy recommendations from the Northern Ireland Planning Commission and contains valuable lessons for conservation policy and practice in other parts of the UK.