9 results for arrival dates

in Boston University Digital Common


Relevance:

10.00%

Publisher:

Abstract:

On January 11, 2008, the National Institutes of Health ('NIH') adopted a revised Public Access Policy for peer-reviewed journal articles reporting research supported in whole or in part by NIH funds. Under the revised policy, the grantee shall ensure that a copy of the author's final manuscript, including any revisions made during the peer review process, be electronically submitted to the National Library of Medicine's PubMed Central ('PMC') archive and that the person submitting the manuscript will designate a time not later than 12 months after publication at which NIH may make the full text of the manuscript publicly accessible in PMC. NIH adopted this policy to implement a new statutory requirement under which: The Director of the National Institutes of Health shall require that all investigators funded by the NIH submit or have submitted for them to the National Library of Medicine's PubMed Central an electronic version of their final, peer-reviewed manuscripts upon acceptance for publication to be made publicly available no later than 12 months after the official date of publication: Provided, That the NIH shall implement the public access policy in a manner consistent with copyright law.

This White Paper is written primarily for policymaking staff in universities and other institutional recipients of NIH support responsible for ensuring compliance with the Public Access Policy. The January 11, 2008, Public Access Policy imposes two new compliance mandates.

First, the grantee must ensure proper manuscript submission. The version of the article to be submitted is the final version over which the author has control, which must include all revisions made after peer review. The statutory command directs that the manuscript be submitted to PMC 'upon acceptance for publication.' That is, the author's final manuscript should be submitted to PMC at the same time that it is sent to the publisher for final formatting and copy editing. Proper submission is a two-stage process. The electronic manuscript must first be submitted through a process that requires input of additional information concerning the article, the author(s), and the nature of NIH support for the research reported. NIH then formats the manuscript into a uniform, XML-based format used for PMC versions of articles. In the second stage of the submission process, NIH sends a notice to the Principal Investigator requesting that the PMC-formatted version be reviewed and approved. Only after such approval has the grantee's manuscript submission obligation been satisfied.

Second, the grantee also has a distinct obligation to grant NIH copyright permission to make the manuscript publicly accessible through PMC not later than 12 months after the date of publication. This obligation is connected to manuscript submission because the author, or the person submitting the manuscript on the author's behalf, must have the necessary rights under copyright at the time of submission to give NIH the copyright permission it requires.

This White Paper explains and analyzes only the scope of the grantee's copyright-related obligations under the revised Public Access Policy and suggests six options for compliance with that aspect of the grantee's obligation. Time is of the essence for NIH grantees. As a practical matter, the grantee should have a compliance process in place no later than April 7, 2008.
More specifically, the new Public Access Policy applies to any article accepted for publication on or after April 7, 2008, if the article arose under (1) an NIH Grant or Cooperative Agreement active in Fiscal Year 2008, (2) direct funding from an NIH Contract signed after April 7, 2008, (3) direct funding from the NIH Intramural Program, or (4) authorship by an NIH employee. In addition, effective May 25, 2008, anyone submitting an application, proposal, or progress report to the NIH must include the PMC reference number when citing articles arising from their NIH-funded research. (This includes applications submitted to the NIH for the May 25, 2008, and subsequent due dates.)

Conceptually, the compliance challenge that the Public Access Policy poses for grantees is easily described. The grantee must depend to some extent upon the author(s) to take the necessary actions to ensure that the grantee is in compliance with the Public Access Policy, because the electronic manuscripts and the copyrights in those manuscripts are initially under the control of the author(s). As a result, any compliance option will require an explicit understanding between the author(s) and the grantee about how the manuscript and the copyright in the manuscript are managed. It is useful to keep the grantee's manuscript submission obligation conceptually separate from its copyright permission obligation, because the compliance personnel concerned with manuscript management may differ from those responsible for overseeing the author's copyright management.

With respect to copyright management, the grantee has the following six options:

(1) rely on authors to manage copyright, but request or require that these authors take responsibility for amending publication agreements that call for transfer of too many rights to enable the author to grant NIH permission to make the manuscript publicly accessible ('the Public Access License');

(2) take a more active role in assisting authors in negotiating the scope of any copyright transfer to a publisher, either by (a) providing advice to authors concerning their negotiations or (b) acting as the author's agent in such negotiations;

(3) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License;

(4) enter into a side agreement with NIH-funded authors that grants a non-exclusive copyright license to the grantee sufficient to grant NIH the Public Access License and that also licenses the grantee to make certain uses of the article, including posting a copy in the grantee's publicly accessible digital archive or repository and authorizing the article to be used in connection with teaching by university faculty;

(5) negotiate a more systematic and comprehensive agreement with the biomedical publishers to ensure either that the publisher has a binding obligation to submit the manuscript and to grant NIH permission to make the manuscript publicly accessible or that the author retains sufficient rights to do so; or

(6) instruct NIH-funded authors to submit manuscripts only to journals with binding deposit agreements with NIH or to journals whose copyright agreements permit authors to retain sufficient rights to authorize NIH to make manuscripts publicly accessible.

Relevance:

10.00%

Publisher:

Abstract:

This collection primarily contains correspondence from Wright's years as president of ASOR. Materials date as far back as 1957 and proceed into the early 1970s. Some of Wright's more notable correspondents include William F. Albright, A. Henry Detweiler, Paul W. Lapp, William Reed, and Dean Seiler. Subject-specific correspondence includes records of expenditures, budget planning, corporate memberships, and the Jerusalem School.

Relevance:

10.00%

Publisher:

Abstract:

The issue of ancestors has been controversial since the first encounters of Christianity with Shona religion. It remains a major theological problem that needs to be addressed within the mainline churches of Zimbabwe today. Instead of ignoring or dismissing the ancestor cult, which deeply influences the socio-political, religious, and economic lives of the Shona, churches in Zimbabwe should initiate a Christology that is based on it. Such a Christology would engage the critical day-to-day issues that make the Shona turn to their ancestors. Among these concerns are daily protection from misfortune, maintaining good health and increasing longevity, successful rainy seasons and food security, and responsible governance characterized by economic and political stability. Since the mid-16th-century arrival of Jesuit missionaries in the Mutapa Kingdom, the Church has realized that many African Christians resorted to their ancestors in times of crisis. Although both Catholic and Protestant missionaries from the 1700s through the early 1900s fiercely attacked Shona traditional beliefs as superstitious and equated ancestors with evil spirits, the cult did not die. Social institutions, such as schools and hospitals provided by missionaries, failed to eliminate ancestral beliefs. Even in the 21st century, many Zimbabweans consult their ancestors. The Shona message to the church remains "Not without My Ancestors." This dissertation examines the significance of the ancestors to the Shona and how selected denominations and new religious movements have interpreted and accommodated ancestral practices. Taking the missiological goal of "self-theologizing" as the framework, this dissertation proposes a "tripartite Christology" of "Jesus the Family Ancestor," "Jesus the Tribal Ancestor," and "Jesus the National Ancestor," based on the Shona "tripartite ancestrology." Familiar ecclesiological and liturgical language, idioms, and symbols are used to contribute to the wider Shona understanding of Jesus as the ancestor par excellence, in whom physical and spiritual needs, including those the ordinary ancestors fail to meet, are fulfilled.

Relevance:

10.00%

Publisher:

Abstract:

Long-range dependence has been observed in many recent Internet traffic measurements. In addition, some recent studies have shown that under certain network conditions, TCP itself can produce traffic that exhibits dependence over limited timescales, even in the absence of higher-level variability. In this paper, we use a simple Markovian model to argue that when the loss rate is relatively high, TCP's adaptive congestion control mechanism indeed generates traffic with OFF periods exhibiting a power-law shape over several timescales, and thus introduces pseudo-long-range dependence into the overall traffic. Moreover, we observe that more variable initial retransmission timeout values for different packets introduce more variable packet inter-arrival times, which increases the burstiness of the overall traffic. We can thus explain why a single TCP connection can produce a time series that can be misidentified as self-similar using standard tests.
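The mechanism is easy to reproduce in a few lines. The following is a minimal sketch (my own illustration, not the paper's exact Markov chain): packets are lost independently with probability p, and k consecutive losses trigger exponential backoff, so an OFF period lasts roughly RTO * (2^k - 1) while occurring with probability p^k. That combination yields a tail that looks straight on a log-log plot over the timescales spanned by the backoff range, i.e. a power-law shape over several octaves but not beyond them.

    import random

    p = 0.3          # assumed loss rate (the high-loss regime the abstract describes)
    base_rto = 1.0   # assumed initial retransmission timeout, in arbitrary time units
    max_backoff = 6  # TCP caps the doubling, which bounds the power-law range

    def off_period():
        """Sample one OFF period: keep doubling the timeout while retransmissions are lost."""
        k, rto, total = 0, base_rto, 0.0
        while random.random() < p and k < max_backoff:
            total += rto
            rto *= 2
            k += 1
        return total

    samples = [off_period() for _ in range(100_000)]
    # The empirical tail P(OFF > t) falls roughly as a power law for t up to
    # base_rto * (2**max_backoff - 1), matching the limited-timescale claim.
    for t in [1, 3, 7, 15, 31, 63]:
        tail = sum(s > t for s in samples) / len(samples)
        print(f"P(OFF > {t:2d}) = {tail:.5f}")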

Relevance:

10.00%

Publisher:

Abstract:

To serve asynchronous requests using multicast, two categories of techniques, stream merging and periodic broadcasting, have been proposed. For sequential streaming access, where requests are uninterrupted from the beginning to the end of an object, these techniques are highly scalable: the required server bandwidth for stream merging grows logarithmically with the request arrival rate, and the required server bandwidth for periodic broadcasting varies logarithmically with the inverse of the start-up delay. However, sequential access is inappropriate for modeling partial requests and client interactivity observed in various streaming access workloads. This paper analytically and experimentally studies the scalability of multicast delivery under a non-sequential access model where requests start at random points in the object. We show that the required server bandwidth for any protocol providing immediate service grows at least as the square root of the request arrival rate, and the required server bandwidth for any protocol providing delayed service grows linearly with the inverse of the start-up delay. We also investigate the impact of limited client receiving bandwidth on scalability. We optimize practical protocols that provide immediate service to non-sequential requests. These protocols utilize limited client receiving bandwidth, and they are near-optimal in that the required server bandwidth is very close to its lower bound.
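To make the scaling contrast concrete, here is a small numeric sketch. The logarithmic form below is the well-known lower bound for immediate-service multicast delivery of sequential requests under Poisson arrivals; the square-root form stands in for this paper's non-sequential lower bound. Exact constants are omitted, so treat both as illustrative shapes rather than the paper's precise expressions.

    import math

    # N = (request arrival rate) x (object duration): expected requests per playback.
    # Sequential, immediate service: server bandwidth ~ ln(1 + N) stream-rates.
    # Non-sequential, immediate service: server bandwidth grows at least ~ sqrt(N).
    for n in [10, 100, 1_000, 10_000, 100_000]:
        b_seq = math.log(1 + n)
        b_rand = math.sqrt(n)
        print(f"N = {n:6d}   sequential ~ {b_seq:5.1f}   non-sequential >~ {b_rand:7.1f}")

The widening gap is the abstract's point: random start points destroy most of the stream sharing that makes sequential multicast delivery so scalable.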

Relevance:

10.00%

Publisher:

Abstract:

Current Internet transport protocols make end-to-end measurements and maintain per-connection state to regulate the use of shared network resources. When a number of such connections share a common endpoint, that endpoint has the opportunity to correlate these end-to-end measurements to better diagnose and control the use of shared resources. A valuable characterization of such shared resources is the "loss topology". From the perspective of a server with concurrent connections to multiple clients, the loss topology is a logical tree rooted at the server in which edges represent lossy paths between a pair of internal network nodes. We develop an end-to-end unicast packet probing technique and an associated analytical framework to: (1) infer loss topologies, (2) identify loss rates of links in an existing loss topology, and (3) augment a topology to incorporate the arrival of a new connection. Correct, efficient inference of loss topology information enables new techniques for aggregate congestion control, QoS admission control, connection scheduling and mirror site selection. Our extensive simulation results demonstrate that our approach is robust in terms of its accuracy and convergence over a wide range of network conditions.
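For intuition about how correlating end-to-end measurements exposes shared losses, consider the simplest case: a two-leaf logical tree with independent per-segment losses. With shared-segment success rate a and per-leaf success rates b1 and b2, back-to-back probes satisfy P1 = a*b1, P2 = a*b2, and P_both = a*b1*b2, so a = P1*P2 / P_both. The sketch below simulates this identity; it is a standard estimator shown only for illustration, and the paper's unicast probing framework is more general.

    import random

    a_true, b1_true, b2_true = 0.90, 0.95, 0.80  # assumed per-segment success rates
    n = 200_000
    ok1 = ok2 = both = 0
    for _ in range(n):
        shared = random.random() < a_true          # probe pair survives the shared segment
        r1 = shared and random.random() < b1_true  # copy toward client 1 survives its leaf
        r2 = shared and random.random() < b2_true  # copy toward client 2 survives its leaf
        ok1 += r1
        ok2 += r2
        both += r1 and r2
    p1, p2, p12 = ok1 / n, ok2 / n, both / n
    print(f"estimated shared-segment success rate: {p1 * p2 / p12:.3f} (true value {a_true})")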

Relevance:

10.00%

Publisher:

Abstract:

The heterogeneity and open nature of network systems make the analysis of compositions of components quite challenging, and they put the design and implementation of robust network services largely out of reach of the average programmer. We propose the development of a novel type system and practical type spaces that reflect simplified, more accessible representations of the results and conclusions that can be derived from complex compositional theories, essentially allowing the system architect or programmer to be exposed only to the inputs and outputs of compositional analysis without having to be familiar with its internals. Toward this end we present the TRAFFIC (Typed Representation and Analysis of Flows For Interoperability Checks) framework, a simple flow-composition and typing language with a corresponding type system. We then discuss and demonstrate the expressive power of a type space for TRAFFIC derived from the network calculus, allowing us to reason about and infer such properties as data arrival, transit, and loss rates in large composite network applications.
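As a flavor of what such a type space can express, here is a toy sketch in the spirit of the abstract (my own illustration, not the TRAFFIC framework's actual syntax): flows are typed by token-bucket arrival curves, servers by rate-latency service curves, and composition either fails to type-check or infers the output flow's type using standard network-calculus bounds.

    from dataclasses import dataclass

    @dataclass
    class FlowType:
        sigma: float  # burst allowance (e.g. bytes)
        rho: float    # sustained rate (e.g. bytes/sec)

    @dataclass
    class ServerType:
        rate: float     # guaranteed service rate R
        latency: float  # worst-case latency T

    def compose(f: FlowType, s: ServerType) -> FlowType:
        """Type-check a flow against a server; infer the output flow's type."""
        if f.rho > s.rate:
            raise TypeError("flow rate exceeds service rate: composition ill-typed")
        # Classic network-calculus bounds: output burst sigma + rho*T
        # (and, not computed here, a delay bound of T + sigma/R).
        return FlowType(sigma=f.sigma + f.rho * s.latency, rho=f.rho)

    out = compose(FlowType(sigma=1500, rho=1e6), ServerType(rate=2e6, latency=0.01))
    print(out)  # FlowType(sigma=11500.0, rho=1000000.0)

The programmer sees only the inferred output type or the type error; the compositional reasoning stays inside the compose step, which is the division of labor the abstract describes.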

Relevance:

10.00%

Publisher:

Abstract:

To investigate the process underlying audiovisual speech perception, the McGurk illusion was examined across a range of phonetic contexts. Two major changes were found. First, the frequency of illusory /g/ fusion percepts increased relative to the frequency of illusory /d/ fusion percepts as the vowel context was shifted from /i/ to /a/ to /u/. This trend could not be explained by biases present in perception of the unimodal visual stimuli. However, the change found in the McGurk fusion effect across vowel environments did correspond systematically with changes in second formant frequency patterns across contexts. Second, the order of consonants in illusory combination percepts was found to depend on syllable type. This may be due to differences occurring across syllable contexts in the time courses of inputs from the two modalities, as delaying the auditory track of a vowel-consonant stimulus resulted in a change in the order of consonants perceived. Taken together, these results suggest that the speech perception system either fuses audiovisual inputs into a visually compatible percept with a second formant pattern similar to that of the acoustic stimulus, or interleaves the information from the different modalities, at a phonemic or subphonemic level, based on their relative arrival times.

Relevance:

10.00%

Publisher:

Abstract:

The human urge to represent the three-dimensional world using two-dimensional pictorial representations dates back at least to Paleolithic times. Artists from ancient to modern times have struggled to understand how a few contours or color patches on a flat surface can induce mental representations of a three-dimensional scene. This article summarizes some of the recent breakthroughs in scientifically understanding how the brain sees, breakthroughs that shed light on these struggles. They illustrate how various artists have intuitively understood paradoxical properties of how the brain sees and have used that understanding to create great art. These paradoxical properties arise from how the brain forms the units of conscious visual perception; namely, representations of three-dimensional boundaries and surfaces. Boundaries and surfaces are computed in parallel cortical processing streams that obey computationally complementary properties. These streams interact at multiple levels to overcome their complementary weaknesses and to transform their complementary properties into consistent percepts. The article describes how properties of complementary consistency have guided the creation of many great works of art.