982 results for Random Access


Relevance:

20.00%

Publisher:

Abstract:

A statistical model of random waves is developed using the Stokes wave theory of water wave dynamics. A new nonlinear probability distribution function of wave height is presented. The results indicate that wave steepness not only serves as a parameter of the wave height distribution function but also reflects the degree to which the wave height distribution deviates from the Rayleigh distribution. The new wave height distribution overcomes the shortcoming of the Rayleigh distribution, which overestimates large waves and underestimates ordinary ones. The new distribution also predicts smaller values for low-probability wave heights than the Rayleigh distribution does. Wave height data taken from East China Normal University are used to verify the new distribution. The results indicate that the new distribution fits the measurements much better than the Rayleigh distribution does.
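The Rayleigh baseline the abstract argues against can be made concrete. Under the Rayleigh model, the probability that an individual wave height H exceeds a level h is exp(-2(h/Hs)^2), where Hs is the significant wave height. A minimal sketch of that reference model (not the paper's new distribution, whose functional form is not given in the abstract):

```python
import math

def rayleigh_exceedance(h, hs):
    """Probability that an individual wave height exceeds h,
    under the Rayleigh model with significant wave height hs."""
    return math.exp(-2.0 * (h / hs) ** 2)

# Under the Rayleigh model, roughly 13.5% of waves exceed the
# significant wave height, since exp(-2) is about 0.135.
p = rayleigh_exceedance(1.0, 1.0)
```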

Based on the second-order solutions obtained for three-dimensional, weakly nonlinear random waves propagating over a steady uniform current in finite water depth, the joint statistical distribution of the velocity and acceleration of a fluid particle in the current direction is derived using the characteristic function expansion method. From this joint distribution and the Morison equation, the theoretical distributions of the drag forces, inertia forces and total random forces caused by waves propagating over a steady uniform current are determined. The distribution of inertia forces is Gaussian, as with the linear wave model, whereas the distributions of drag forces and total random forces deviate slightly from those derived using the linear wave model. The distributions presented are determined by the wave number spectrum of the ocean waves, the current speed, and the second-order wave-wave and wave-current interactions. As an illustrative example, for fully developed deep ocean waves, the parameters appearing in the distributions near the still water level are calculated for various wind and current speeds using the Donelan-Pierson-Banner spectrum, and the effects of the current and of the nonlinearity of the ocean waves on the distributions are studied.
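The Morison equation referenced above splits the in-line force on a slender cylinder into a drag term, quadratic in the fluid velocity, and an inertia term, linear in the acceleration. A minimal sketch (density and coefficient values are illustrative defaults, not the paper's):

```python
import math

def morison_force(u, dudt, rho=1025.0, D=1.0, Cd=1.0, Cm=2.0):
    """In-line force per unit length on a vertical cylinder
    (Morison equation): drag ~ u|u| plus inertia ~ du/dt."""
    drag = 0.5 * rho * Cd * D * u * abs(u)
    inertia = rho * Cm * (math.pi * D ** 2 / 4.0) * dudt
    return drag + inertia
```

Because the drag term is quadratic in the total velocity (wave orbital velocity plus current), its distribution is non-Gaussian, while the inertia term, linear in the acceleration, stays Gaussian, as the abstract notes.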

Duplications and rearrangements of coding genes are major themes in the evolution of mitochondrial genomes, with important consequences for the function of mitochondria and the fitness of organisms. Yu et al. (BMC Genomics 2008, 9: 477) reported the complete mt genome sequence of the oyster Crassostrea hongkongensis (16,475 bp) and found that a DNA segment containing four tRNA genes (trnK(1), trnC, trnQ(1) and trnN), a duplicated rRNA gene (rrnS) and a split rRNA gene (rrnL5') was absent compared with that of two other Crassostrea species. It was suggested that the absence was a novel case of "tandem duplication-random loss" with evolutionary significance. We independently sequenced the complete mt genome of three C. hongkongensis individuals, all of which were 18,622 bp and contained the segment missing from Yu et al.'s sequence. Further, we designed primers, verified sequences, and demonstrated that the sequence loss in Yu et al.'s study was an artifact caused by placing primers in a duplicated region. The duplication and split of the ribosomal RNA genes are unique to Crassostrea oysters and are not lost in C. hongkongensis. Our study highlights the need for caution when amplifying and sequencing through duplicated regions of a genome.

We seek to both detect and segment objects in images. To exploit both local image data and contextual information, we introduce Boosted Random Fields (BRFs), which use Boosting to learn the graph structure and local evidence of a conditional random field (CRF). The graph structure is learned by assembling graph fragments in an additive model. The connections between individual pixels are not very informative, but by using dense graphs we can pool information from large regions of the image; dense models also support efficient inference. We show how contextual information from other objects can improve detection performance, both in accuracy and in speed, by using a computational cascade. We apply our system to detect stuff and things in office and street scenes.
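The additive model at the heart of boosting can be illustrated in isolation: each round fits a weak component to the current residuals and adds it to the ensemble. A toy 1-D regression sketch with threshold stumps, shown only to convey the additive-model idea, not the BRF graph-fragment learner itself:

```python
def fit_stump(xs, res):
    """Pick the threshold and left/right values minimizing
    squared error against the current residuals."""
    best = None
    for t in xs:
        left = [r for x, r in zip(xs, res) if x <= t]
        right = [r for x, r in zip(xs, res) if x > t]
        lv = sum(left) / len(left) if left else 0.0
        rv = sum(right) / len(right) if right else 0.0
        err = sum((r - (lv if x <= t else rv)) ** 2
                  for x, r in zip(xs, res))
        if best is None or err < best[0]:
            best = (err, t, lv, rv)
    return best[1:]

def boost(xs, ys, rounds=10, lr=0.5):
    """Additive model: each round adds a damped stump fit
    to the residuals of the ensemble so far."""
    pred = [0.0] * len(xs)
    model = []
    for _ in range(rounds):
        res = [y - p for y, p in zip(ys, pred)]
        t, lv, rv = fit_stump(xs, res)
        model.append((t, lv, rv))
        pred = [p + lr * (lv if x <= t else rv)
                for x, p in zip(xs, pred)]
    return model, pred
```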

Skin flap procedures are commonly used in plastic surgery, but failures can occur, leading to necrosis of the flap. Many studies have therefore used low-level laser therapy (LLLT) to improve flap viability, and the LED has recently been introduced as an alternative to LLLT. The objective of this study was to evaluate the effect of LLLT and LED on the viability of random skin flaps in rats. Forty-eight rats were divided into four groups, and a random skin flap (10 x 4 cm) was raised in all animals. Group 1 was the sham group; group 2 received LLLT at 660 nm, 0.14 J; group 3 received LED at 630 nm, 2.49 J; and group 4 received LLLT at 660 nm, 2.49 J. Irradiation was applied after surgery and repeated on the four subsequent days. On the 7th postoperative day, the percentage of flap necrosis was calculated, and skin samples were collected from the viable area and from the transition line of the flap to evaluate blood vessels and mast cells. The percentage of necrosis was significantly lower in groups 3 and 4 than in groups 1 and 2. Concerning blood vessel and mast cell numbers, only the animals in group 3 showed a significant increase compared with group 1 in the skin sample from the transition line. LED and LLLT at the same total energy were effective in increasing the viability of random skin flaps; LED was more effective in increasing the number of mast cells and blood vessels in the transition line.

Cumbers, B., Urquhart, C. & Durbin, J. (2006). Evaluation of the KA24 (Knowledge Access 24) service for health and social care staff in London and the South-East of England. Part 1: Quantitative. Health Information and Libraries Journal, 23(2), 133-139. Sponsorship: KA24 - NHS Trusts, London.

Durbin, J. & Urquhart, C. (2003). Qualitative evaluation of KA24 (Knowledge Access 24). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Knowledge Access 24 (NHS)

Canals, A., Breen, A. R., Ofman, L., Moran, P. J. & Fallows, R. A. Estimating random transverse velocities in the fast solar wind from EISCAT Interplanetary Scintillation measurements. Annales Geophysicae, 20(9), 1265-1277.

Shen, Q., Zhao, R. & Tang, W. (2008). Modelling random fuzzy renewal reward processes. IEEE Transactions on Fuzzy Systems, 16(5), 1379-1385.

The resolution passed by the BU University Council approving an initiative to establish an archive of the research and scholarship produced by the faculty of the University.

The Internet has brought unparalleled opportunities for expanding availability of research by bringing down economic and physical barriers to sharing. The digitally networked environment promises to democratize access, carry knowledge beyond traditional research niches, accelerate discovery, encourage new and interdisciplinary approaches to ever more complex research challenges, and enable new computational research strategies. However, despite these opportunities for increasing access to knowledge, the prices of scholarly journals have risen sharply over the past two decades, often forcing libraries to cancel subscriptions. Today even the wealthiest institutions cannot afford to sustain all of the journals needed by their faculties and students. To take advantage of the opportunities created by the Internet and to further their mission of creating, preserving, and disseminating knowledge, many academic institutions are taking steps to capture the benefits of more open research sharing. Colleges and universities have built digital repositories to preserve and distribute faculty scholarly articles and other research outputs. Many individual authors have taken steps to retain the rights they need, under copyright law, to allow their work to be made freely available on the Internet and in their institution's repository. And, faculties at some institutions have adopted resolutions endorsing more open access to scholarly articles. Most recently, on February 12, 2008, the Faculty of Arts and Sciences (FAS) at Harvard University took a landmark step. The faculty voted to adopt a policy requiring that faculty authors send an electronic copy of their scholarly articles to the university's digital repository and that faculty authors automatically grant copyright permission to the university to archive and to distribute these articles unless a faculty member has waived the policy for a particular article.
Essentially, the faculty voted to make open access to the results of their published journal articles the default policy for the Faculty of Arts and Sciences of Harvard University. As of March 2008, a proposal is also under consideration in the University of California system by which faculty authors would commit routinely to grant copyright permission to the university to make copies of the faculty's scholarly work openly accessible over the Internet. Inspired by the example set by the Harvard faculty, this White Paper is addressed to the faculty and administrators of academic institutions who support equitable access to scholarly research and knowledge, and who believe that the institution can play an important role as steward of the scholarly literature produced by its faculty. This paper discusses both the motivation and the process for establishing a binding institutional policy that automatically grants a copyright license from each faculty member to permit deposit of his or her peer-reviewed scholarly articles in institutional repositories, from which the works become available for others to read and cite.

On January 11, 2008, the National Institutes of Health ('NIH') adopted a revised Public Access Policy for peer-reviewed journal articles reporting research supported in whole or in part by NIH funds. Under the revised policy, the grantee shall ensure that a copy of the author's final manuscript, including any revisions made during the peer review process, be electronically submitted to the National Library of Medicine's PubMed Central ('PMC') archive and that the person submitting the manuscript will designate a time not later than 12 months after publication at which NIH may make the full text of the manuscript publicly accessible in PMC. NIH adopted this policy to implement a new statutory requirement under which: The Director of the National Institutes of Health shall require that all investigators funded by the NIH submit or have submitted for them to the National Library of Medicine's PubMed Central an electronic version of their final, peer-reviewed manuscripts upon acceptance for publication to be made publicly available no later than 12 months after the official date of publication: Provided, That the NIH shall implement the public access policy in a manner consistent with copyright law. This White Paper is written primarily for policymaking staff in universities and other institutional recipients of NIH support responsible for ensuring compliance with the Public Access Policy. The January 11, 2008, Public Access Policy imposes two new compliance mandates. First, the grantee must ensure proper manuscript submission. The version of the article to be submitted is the final version over which the author has control, which must include all revisions made after peer review. The statutory command directs that the manuscript be submitted to PMC 'upon acceptance for publication.' That is, the author's final manuscript should be submitted to PMC at the same time that it is sent to the publisher for final formatting and copy editing. 
Proper submission is a two-stage process. The electronic manuscript must first be submitted through a process that requires input of additional information concerning the article, the author(s), and the nature of NIH support for the research reported. NIH then formats the manuscript into a uniform, XML-based format used for PMC versions of articles. In the second stage of the submission process, NIH sends a notice to the Principal Investigator requesting that the PMC-formatted version be reviewed and approved. Only after such approval has been given is the grantee's manuscript submission obligation satisfied. Second, the grantee also has a distinct obligation to grant NIH copyright permission to make the manuscript publicly accessible through PMC not later than 12 months after the date of publication. This obligation is connected to manuscript submission because the author, or the person submitting the manuscript on the author's behalf, must have the necessary rights under copyright at the time of submission to give NIH the copyright permission it requires. This White Paper explains and analyzes only the scope of the grantee's copyright-related obligations under the revised Public Access Policy and suggests six options for compliance with that aspect of the grantee's obligation. Time is of the essence for NIH grantees. As a practical matter, the grantee should have a compliance process in place no later than April 7, 2008. More specifically, the new Public Access Policy applies to any article accepted for publication on or after April 7, 2008 if the article arose under (1) an NIH Grant or Cooperative Agreement active in Fiscal Year 2008, (2) direct funding from an NIH Contract signed after April 7, 2008, (3) direct funding from the NIH Intramural Program, or (4) an NIH employee.
In addition, effective May 25, 2008, anyone submitting an application, proposal or progress report to the NIH must include the PMC reference number when citing articles arising from their NIH funded research. (This includes applications submitted to the NIH for the May 25, 2008 and subsequent due dates.) Conceptually, the compliance challenge that the Public Access Policy poses for grantees is easily described. The grantee must depend to some extent upon the author(s) to take the necessary actions to ensure that the grantee is in compliance with the Public Access Policy because the electronic manuscripts and the copyrights in those manuscripts are initially under the control of the author(s). As a result, any compliance option will require an explicit understanding between the author(s) and the grantee about how the manuscript and the copyright in the manuscript are managed. It is useful to conceptually keep separate the grantee's manuscript submission obligation from its copyright permission obligation because the compliance personnel concerned with manuscript management may differ from those responsible for overseeing the author's copyright management. 
With respect to copyright management, the grantee has the following six options: (1) rely on authors to manage copyright but also to request or to require that these authors take responsibility for amending publication agreements that call for transfer of too many rights to enable the author to grant NIH permission to make the manuscript publicly accessible ('the Public Access License'); (2) take a more active role in assisting authors in negotiating the scope of any copyright transfer to a publisher by (a) providing advice to authors concerning their negotiations or (b) by acting as the author's agent in such negotiations; (3) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License; (4) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License and also grants a license to the grantee to make certain uses of the article, including posting a copy in the grantee's publicly accessible digital archive or repository and authorizing the article to be used in connection with teaching by university faculty; (5) negotiate a more systematic and comprehensive agreement with the biomedical publishers to ensure either that the publisher has a binding obligation to submit the manuscript and to grant NIH permission to make the manuscript publicly accessible or that the author retains sufficient rights to do so; or (6) instruct NIH-funded authors to submit manuscripts only to journals with binding deposit agreements with NIH or to journals whose copyright agreements permit authors to retain sufficient rights to authorize NIH to make manuscripts publicly accessible.

A working paper written for Boston University Libraries to foster discussion about how to provide better support for BU faculty authors.

Recent work in sensor databases has focused extensively on distributed query problems, notably the distributed computation of aggregates. Existing methods for computing aggregates broadcast queries to all sensors and use in-network aggregation of responses to minimize messaging costs. In this work, we focus on uniform random sampling across nodes, which can serve both as an alternative building block for aggregation and as an integral component of many other useful randomized algorithms. Prior to our work, the best existing proposals for uniform random sampling of sensors involved contacting all nodes in the network. We propose a practical method which is only approximately uniform, but contacts a number of sensors proportional to the diameter of the network instead of its size. The approximation achieved is tunably close to exact uniform sampling, and relies only on well-known existing primitives, namely geographic routing, distributed computation of Voronoi regions, and von Neumann's rejection method. Ultimately, our sampling algorithm has the same worst-case asymptotic cost as routing a point-to-point message, and is thus asymptotically optimal among request/reply-based sampling methods. We provide experimental results demonstrating the effectiveness of our algorithm on both synthetic and real sensor topologies.
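The rejection step can be sketched in isolation. If a naive scheme reaches node i with probability proportional to some weight w_i (for example, the area of its Voronoi region), von Neumann's rejection method restores uniformity: accept a pick with probability w_min / w_i and retry otherwise. A minimal sketch under those assumptions (the weighted picker stands in for the paper's geographic routing layer):

```python
import random

def weighted_pick(items, weights):
    """Naive picker: returns item i with probability
    proportional to weights[i]."""
    return random.choices(items, weights=weights, k=1)[0]

def uniform_via_rejection(items, weights):
    """von Neumann rejection: accept a weighted pick with
    probability w_min / w_i, so each item is returned with
    equal probability overall."""
    w_min = min(weights)
    w = dict(zip(items, weights))
    while True:
        x = weighted_pick(items, weights)
        if random.random() < w_min / w[x]:
            return x
```

The expected number of retries grows with the skew of the weights, which mirrors the paper's trade-off between sampling cost and closeness to exact uniformity.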

The World Wide Web (WWW, or simply the Web) is growing rapidly on the Internet. Web users want fast response times and easy access to an enormous variety of information across the world, so performance is becoming a central issue for the Web. Fractals have been used to study fluctuating phenomena in many different disciplines, from the distribution of galaxies in astronomy to complex physiological control systems. The Web, too, is a complex, irregular, and random system. In this paper, we examine document reference patterns at Internet Web servers and use fractal-based models to understand aspects (e.g., caching schemes) that affect Web performance.
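One common signature of such self-similar reference patterns is a Zipf-like power law: the k-th most referenced document receives roughly 1/k^alpha of the requests. A sketch that estimates alpha from ranked reference counts by a least-squares fit in log-log space (an illustrative diagnostic, not the paper's specific model):

```python
import math

def zipf_slope(counts):
    """Fit log(count) = c - alpha * log(rank) by least squares
    and return alpha; counts must be sorted in decreasing order."""
    xs = [math.log(r) for r in range(1, len(counts) + 1)]
    ys = [math.log(c) for c in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope
```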