984 results for file
Abstract:
We propose a new technique for efficiently delivering popular content from information repositories with bounded file caches. Our strategy relies on the use of fast erasure codes (a.k.a. forward error correcting codes) to generate encodings of popular files, of which only a small sliding window is cached at any time instant, even to satisfy an unbounded number of asynchronous requests for the file. Our approach capitalizes on concurrency to maximize sharing of state across different request threads while minimizing cache memory utilization. Additional reduction in resource requirements arises from providing for a lightweight version of the network stack. In this paper, we describe the design and implementation of our Cyclone server as a Linux kernel subsystem.
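The sliding-window idea above can be illustrated with a short sketch. This is not the Cyclone kernel implementation (the paper builds it as a Linux kernel subsystem); it is a user-space Python toy in which the "encoder" is a stand-in XOR of random source blocks rather than a real erasure/rateless code, and decoding is omitted. What it does show is the caching pattern: encoded symbols are produced continuously, only the most recent WINDOW of them is kept in memory, and any number of asynchronous request threads share that one bounded window.

```python
# Minimal sketch of sliding-window serving, assuming a generic erasure/rateless
# encoder; the XOR-of-random-blocks "encoder" below is a stand-in, not the
# paper's actual code, and decoding is omitted.
import random
import threading
from collections import deque

BLOCK = 4096
WINDOW = 64          # number of encoded symbols cached at any instant

def load_blocks(path):
    """Split the popular file into fixed-size source blocks."""
    with open(path, "rb") as f:
        data = f.read()
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

class SlidingWindowServer:
    """Continuously produces encoded symbols, caching only the last WINDOW."""
    def __init__(self, blocks):
        self.blocks = blocks
        self.window = deque(maxlen=WINDOW)   # bounded cache shared by all requests
        self.cond = threading.Condition()
        self.next_id = 0

    def _encode_one(self):
        # Placeholder "encoding": XOR of a small random subset of source blocks.
        chosen = random.sample(range(len(self.blocks)), k=min(3, len(self.blocks)))
        sym = bytearray(BLOCK)
        for idx in chosen:
            blk = self.blocks[idx].ljust(BLOCK, b"\0")
            sym = bytearray(a ^ b for a, b in zip(sym, blk))
        return (self.next_id, chosen, bytes(sym))

    def produce(self, n_symbols):
        """Producer loop: generate symbols, evicting the oldest beyond WINDOW."""
        for _ in range(n_symbols):
            with self.cond:
                self.window.append(self._encode_one())
                self.next_id += 1
                self.cond.notify_all()

    def serve(self, need):
        """A request thread gathers `need` distinct symbols from the shared window."""
        got, seen = [], set()
        while len(got) < need:
            with self.cond:
                fresh = [s for s in self.window if s[0] not in seen]
                if not fresh:
                    self.cond.wait(timeout=1.0)
                    continue
                for s in fresh:
                    seen.add(s[0])
                    got.append(s)
        return got
```

A producer thread would call produce() while each client thread calls serve(k) for however many symbols its decoder needs; the cache footprint stays at WINDOW symbols regardless of how many clients are active.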
Abstract:
The purpose of this project is the creation of a graphical "programming" interface for a sensor network tasking language called STEP. The graphical interface allows the user to specify a program execution graphically from an extensible palette of functionalities and save the results as a properly formatted STEP file. Moreover, the software is able to load a file in STEP format and convert it into the corresponding graphical representation. During both phases a type-checker runs in the background to ensure that both the graphical representation and the STEP file are syntactically correct. This project has been motivated by the Sensorium project at Boston University. In this technical report we present the basic features of the software and the process followed during its design and implementation. Finally, we describe the approach used to test and validate our software.
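As a rough illustration of the save path with a background type-checker, the hedged sketch below validates a toy node-and-wire program before writing it out. The node kinds, port types and the textual output format are invented for the example and are not the actual STEP syntax.

```python
# Hedged sketch: a type-check pass over a graphical program, run before export.
# Node kinds, port types and the output format are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str                                      # e.g. "sense", "filter", "send"
    out_type: str                                  # type produced on the output port
    in_types: list = field(default_factory=list)   # types expected on input ports

@dataclass
class Edge:
    src: Node
    dst: Node
    dst_port: int

def type_check(nodes, edges):
    """Return a list of error strings; an empty list means the graph is well-typed."""
    errors = []
    for e in edges:
        expected = e.dst.in_types[e.dst_port]
        if e.src.out_type != expected:
            errors.append(
                f"{e.src.name} -> {e.dst.name}: port {e.dst_port} expects "
                f"{expected}, got {e.src.out_type}")
    return errors

def export_step(nodes, edges, path):
    """Write a hypothetical textual program only if the graph type-checks."""
    errors = type_check(nodes, edges)
    if errors:
        raise ValueError("; ".join(errors))
    with open(path, "w") as f:
        for n in nodes:
            f.write(f"node {n.name} : {n.kind}\n")
        for e in edges:
            f.write(f"wire {e.src.name} -> {e.dst.name}.{e.dst_port}\n")
```

Loading would run the same type_check after reconstructing the graph from a file, so both the graphical and textual forms are kept consistent.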
Abstract:
Emerging configurable infrastructures such as large-scale overlays and grids, distributed testbeds, and sensor networks comprise diverse sets of available computing resources (e.g., CPU and OS capabilities and memory constraints) and network conditions (e.g., link delay, bandwidth, loss rate, and jitter) whose characteristics are both complex and time-varying. At the same time, distributed applications to be deployed on these infrastructures exhibit increasingly complex constraints and requirements on the resources they wish to utilize. Examples include selecting nodes and links to schedule an overlay multicast file transfer across the Grid, or embedding a network experiment with specific resource constraints in a distributed testbed such as PlanetLab. Thus, a common problem facing the efficient deployment of distributed applications on these infrastructures is that of "mapping" application-level requirements onto the network in such a manner that the requirements of the application are realized, assuming that the underlying characteristics of the network are known. We refer to this problem as the network embedding problem. In this paper, we propose a new approach to tackle this combinatorially-hard problem. Thanks to a number of heuristics, our approach greatly improves performance and scalability over previously existing techniques. It does so by pruning large portions of the search space without overlooking any valid embedding. We present a construction that allows a compact representation of candidate embeddings, which is maintained by carefully controlling the order in which candidate mappings are inserted and invalid mappings are removed. We present an implementation of our proposed technique, which we call NETEMBED – a service that identifies feasible mappings of a virtual network configuration (the query network) onto an existing real infrastructure or testbed (the hosting network). We present results of extensive performance evaluation experiments of NETEMBED using several combinations of real and synthetic network topologies. Our results show that our NETEMBED service is quite effective in identifying one (or all) possible embeddings for quite sizable query and hosting networks – much larger than any of the existing techniques or services are able to handle.
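The pruning-plus-backtracking flavour of this kind of search can be sketched as follows. This is an illustrative constraint-satisfaction style search, not the NETEMBED implementation or its compact candidate representation; the attribute names ("cpu" for nodes, "bw" for links) are assumptions made for the example.

```python
# Illustrative sketch (not NETEMBED itself): prune per-node candidate sets by
# resource constraints, then backtrack over what remains, checking link
# bandwidth as the mapping grows.

def candidates(query_nodes, host_nodes):
    """Host nodes that satisfy each query node's CPU requirement."""
    return {q: {h for h, attrs in host_nodes.items()
                if attrs["cpu"] >= query_nodes[q]["cpu"]}
            for q in query_nodes}

def consistent(mapping, q_link, host_links, query_links):
    """Check that an already-mapped query link has enough host bandwidth."""
    a, b = q_link
    if a in mapping and b in mapping:
        need = query_links[q_link]["bw"]
        have = host_links.get((mapping[a], mapping[b]), {"bw": 0})["bw"]
        return have >= need
    return True

def embed(query_nodes, query_links, host_nodes, host_links):
    cand = candidates(query_nodes, host_nodes)
    order = sorted(cand, key=lambda q: len(cand[q]))   # most-constrained first

    def backtrack(i, mapping, used):
        if i == len(order):
            return dict(mapping)
        q = order[i]
        for h in cand[q] - used:
            mapping[q] = h
            if all(consistent(mapping, l, host_links, query_links)
                   for l in query_links):
                found = backtrack(i + 1, mapping, used | {h})
                if found:
                    return found
            del mapping[q]
        return None

    return backtrack(0, {}, set())
```

A call such as embed({"v1": {"cpu": 2}, "v2": {"cpu": 1}}, {("v1", "v2"): {"bw": 10}}, host_nodes, host_links) returns one feasible mapping or None; enumerating all embeddings would collect every successful leaf instead of returning the first.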
Abstract:
A model for representing music scores in a form suitable for general processing by a music-analyst-programmer is proposed and implemented. Typical input to the model consists of one or more pieces of music which are encoded in a file-based score representation. File-based representations are in a form unsuited to general processing, as they do not provide a suitable level of abstraction for a programmer-analyst. Instead, a representation is created giving a programmer's view of the score. This frees the analyst-programmer from implementation details that would otherwise form a substantial barrier to progress. The score representation uses an object-oriented approach to create a natural and robust software environment for the musicologist. The system is used to explore ways in which it could benefit musicologists. Methodologies for analysing music corpora are presented in a series of analytic examples which illustrate some of the potential of this model. Proving hypotheses or performing analysis on corpora involves the construction of algorithms. Some unique aspects of using this score model for corpus-based musicology are:
- Algorithms impose a discipline which arises from the necessity for formalism.
- Automatic analysis enables musicologists to complete tasks that would otherwise be infeasible because of limitations of their energy, attentiveness, accuracy and time.
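To give a flavour of the "programmer's view of the score" that such a model provides, here is a minimal, hypothetical object model with one corpus-style query; the class names and attributes are illustrative assumptions, not the design actually used in the work.

```python
# A minimal, illustrative object model of a score (class names are assumptions,
# not the thesis's actual design), with one corpus-style analysis query.
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int        # MIDI pitch number
    duration: float   # in quarter notes

@dataclass
class Measure:
    notes: list

@dataclass
class Score:
    title: str
    measures: list

    def all_notes(self):
        for m in self.measures:
            yield from m.notes

def pitch_class_histogram(corpus):
    """Count pitch-class occurrences across a corpus of Score objects."""
    hist = {pc: 0 for pc in range(12)}
    for score in corpus:
        for note in score.all_notes():
            hist[note.pitch % 12] += 1
    return hist

corpus = [Score("Example", [Measure([Note(60, 1.0), Note(64, 1.0), Note(67, 2.0)])])]
print(pitch_class_histogram(corpus))
```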
Abstract:
What I set out to do in this thesis is to produce an edition of the literary work of Mícheál Coimín, a poet and prose author who was writing in Cill Chorcoráin, County Clare, in the eighteenth century. He died in 1760 when he was almost 90 years of age. It can be said that, within the canon of Irish-language literature, Coimín ranks among the minor writers, and there is no doubt that many present-day readers of Irish are unaware of his writing. He is renowned as the author of 'Laoi Oisín i dTír na n-Óg', but as this thesis will show, it appears most likely that he did not write it. From his pen we have a small handful of poems and two romance tales ('Eachtra Thoroilbh Mhic Stairn' and 'Eachtra a Thriúr Mac'), which he wrote when that prose tradition was in its final decline. None of his work is available in editions that would satisfy today's readers or the scholarly criteria now in force. No manuscript in Coimín's own hand has come down to us, and for that reason the editorial process presented a particular challenge. In the thesis I carried out the most complete survey to date of his work in the manuscript tradition and of the scribes who transmitted it. My aim was to identify the most authoritative manuscripts in the tradition in order to prepare editions faithful to his original work. In addition, an account is given of his life and literary background, and a critical examination is made of his prose and poetic work.
Abstract:
This paper describes implementations of two mobile cloud applications, file synchronisation and intensive data processing, using the Context Aware Mobile Cloud Services middleware and the Cloud Personal Assistant. Both are part of the same mobile cloud project, which is actively developed and currently at its second version. We describe recent changes to the middleware, along with our experimental results for the two application models. We discuss challenges faced during the development of the middleware and their implications. The paper includes a performance analysis of the CPA support for the two applications with respect to existing solutions.
Abstract:
This research investigates some of the reasons for the reported difficulties experienced by writers when using editing software designed for structured documents. The overall objective was to determine whether there are aspects of the software interfaces which militate against optimal document construction by writers who are not computer experts, and to suggest possible remedies. Studies were undertaken to explore the nature and extent of the difficulties, and to identify which components of the software interfaces are involved. A model of a revised user interface was tested, and some possible adaptations to the interface are proposed which may help overcome the difficulties. The methodology comprised:
1. identification and description of the nature of a ‘structured document’ and what distinguishes it from other types of document used on computers;
2. isolation of the requirements of users of such documents, and the construction of a set of personas which describe them;
3. evaluation of other work on the interaction between humans and computers, specifically in software for creating and editing structured documents;
4. estimation of the levels of adoption of the available software for editing structured documents and the reactions of existing users to it, with specific reference to difficulties encountered in using it;
5. examination of the software and identification of any mismatches between the expectations of users and the facilities provided by the software;
6. assessment of any physical or psychological factors in the reported difficulties experienced, and determination of what (if any) changes to the software might affect these.
The conclusions are that seven of the twelve modifications tested could contribute to an improvement in usability, effectiveness, and efficiency when writing structured text (new document selection; adding new sections and new lists; identifying key information typographically; the creation of cross-references and bibliographic references; and the inclusion of parts of other documents). The remaining five were seen as more applicable to editing existing material than to authoring new text (adding new elements; splitting and joining elements [before and after]; and moving block text).
Abstract:
Background: Many European countries, including Ireland, lack high quality, on-going, population based estimates of maternal behaviours and experiences during pregnancy. PRAMS is a CDC surveillance program which was established in the United States in 1987 to generate high quality, population based data to reduce infant mortality rates and improve maternal and infant health. PRAMS is the only on-going population based surveillance system of maternal behaviours and experiences that occur before, during and after pregnancy worldwide. Methods: The objective of this study was to adapt, test and evaluate a modified CDC PRAMS methodology in Ireland. The birth certificate file, which is the standard approach to sampling for PRAMS in the United States, was not available for the PRAMS Ireland study. Consequently, delivery record books for the period between 3 and 5 months before the study start date at a large urban obstetric hospital [8,900 births per year] were used to randomly sample 124 women. Name, address, maternal age, infant sex, gestational age at delivery, delivery method, APGAR score and birth weight were manually extracted from records. Stillbirths and early neonatal deaths were excluded using APGAR scores and hospital records. Women were sent a letter of invitation to participate, including an option to opt out, followed by a modified PRAMS survey, a reminder letter and a final survey. Results: The response rate for the pilot was 67%. Two per cent of women refused the survey, 7% opted out of the study and 24% did not respond. Survey items were at least 88% complete for all 82 respondents. Prevalence estimates of socially undesirable behaviours such as alcohol consumption during pregnancy were high [>50%] and comparable with international estimates. Conclusion: PRAMS is a feasible and valid method of collecting information on maternal experiences and behaviours during pregnancy in Ireland. With further work, PRAMS may offer a solution to data deficits in maternal health behaviour indicators in Ireland. This study is important to researchers in Europe and elsewhere who may be interested in new ways of tailoring an established CDC methodology to their unique settings to resolve data deficits in maternal health.
Abstract:
Vietnam launched its first-ever stock market, named the Ho Chi Minh City Securities Trading Center (HSTC), on July 20, 2000. This is one of the pioneering works on the HSTC, and it finds empirical evidence for the following: anomalies in HSTC stock returns through clusters of limit-hits and limit-hit sequences; a strong herd effect toward extreme positive returns of the market portfolio; and the fact that an ARMA-GARCH specification captures fairly well issues such as serial correlation and fat tails for the stabilized period. By using further information and policy dummy variables, it is justifiable that policy decisions on technicalities of trading can have influential impacts on the movement of the risk level, through the conditional variance behaviour of HSTC stock returns. Policies on trading and disclosure practices have had profound impacts on the Vietnam Stock Market (VSM). The over-use of policy tools can harm the market and investing mentality. Price limits become increasingly irrelevant and prevent the market from self-adjusting to equilibrium. These results on the VSM have not been reported before in the literature on Vietnam’s financial markets. Given the policy implications, we suggest that the Vietnamese authorities rethink the use of price limits and give more freedom to market participants.
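For readers who want to reproduce the general approach, the sketch below fits an AR(1)-GARCH(1,1) model with Student-t errors and a policy dummy in the mean equation using the Python arch package. The input file, column names, dummy date and exact specification are assumptions for illustration; the paper's own ARMA-GARCH specification and dummy placement may differ.

```python
# Hedged sketch of an AR-GARCH fit on daily index returns with a policy dummy.
# File name, column names and the dummy date are hypothetical examples.
import pandas as pd
from arch import arch_model

prices = pd.read_csv("hstc_index.csv", parse_dates=["date"], index_col="date")
returns = 100 * prices["close"].pct_change().dropna()

# Example policy dummy: 1 after a hypothetical change in the daily price limit.
policy = (returns.index >= "2002-08-01").astype(float)
exog = pd.DataFrame({"policy": policy}, index=returns.index)

# AR(1) mean with an exogenous policy regressor, GARCH(1,1) variance,
# Student-t errors to allow for the fat tails noted in the abstract.
model = arch_model(returns, x=exog, mean="ARX", lags=1,
                   vol="GARCH", p=1, q=1, dist="t")
result = model.fit(disp="off")
print(result.summary())
```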
Abstract:
In this work we revisit the problem of hedging a contingent claim under the mean-square criterion. We prove that, in an incomplete market, a probability measure can be identified under which the underlying price process becomes a martingale; this is in fact a new proposition on the martingale representation theorem. The new results also identify a weight function that serves as an approximation to the Radon-Nikodým derivative of the unique neutral martingale measure.
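For context, a common formulation of the mean-square (quadratic) hedging problem referred to in this abstract is given below, in generic notation rather than the paper's own.

```latex
% Common formulation of mean-square hedging (generic notation, not the paper's):
% choose an initial capital c and a trading strategy \theta to minimise the
% quadratic hedging error for a contingent claim H with price process S.
\[
  \min_{c,\;\theta}\; \mathbb{E}\!\left[\left(H - c - \int_0^T \theta_t \, dS_t\right)^{\!2}\right]
\]
% In an incomplete market the minimiser is tied to a particular martingale
% measure \widetilde{Q} (often called variance-optimal in this literature);
% the weight function mentioned in the abstract approximates its
% Radon-Nikodym derivative d\widetilde{Q}/dP.
```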
Abstract:
This study focuses on the substantial changes that characterize the shift in Vietnam’s macroeconomic structures and the evolution of micro-structural interaction over the important period 1991-2008. The results show that these events are completely distinct in terms of (i) economic nature; (ii) scale and depth of changes; (iii) start and end results; and (iv) requirements for macroeconomic decisions. The study rejected a suspected similarity between the contagion of the Asian financial crisis in 1997-98 and the economic chaos in the first half of 2008 (starting from late 2007). The depth and economic settings of, and the interconnection between, macro choices and micro decisions have all grown significantly, partly due to a much deeper level of integration of Vietnam into the world’s economy. On the one hand, this phenomenon raises the efficiency of macro-level policies, because the consideration of micro-structural factors within the framework has become increasingly critical. On the other hand, this is a unique opportunity for the macroeconomic mechanism of Vietnam to improve vastly, given the context in which the national economy has entered an ever-changing period under pressures of globalization and re-integration. The authors also hope to open up paths for further empirical verification and to stress the fact that macro policies will have, from now on, to be decided in line with changing micro-settings, which specify a market economy and decide the degree of success of any macroeconomic choice.
Abstract:
This book examines the thought of Leo Strauss (1899-1973) and, on that basis, studies the strategies of exposition and concealment in philosophy. The studies it brings together assess the significance of the hypothesis of a "forgotten art of writing" and examine the fruitfulness and the limits of the Straussian conception of philosophical writing.