983 results for Access Reform
Abstract:
The stability of scheduled multiaccess communication with random coding and independent decoding of messages is investigated. The number of messages that may be scheduled for simultaneous transmission is limited to a given maximum value, and the channels from transmitters to receiver are quasistatic, flat, and have independent fades. Requests for message transmissions are assumed to arrive according to an i.i.d. arrival process. Then, we show the following: (1) in the limit of large message alphabet size, the stability region has an interference limited information-theoretic capacity interpretation, (2) state-independent scheduling policies achieve this asymptotic stability region, and (3) in the asymptotic limit corresponding to immediate access, the stability region for non-idling scheduling policies is shown to be identical irrespective of received signal powers.
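As a toy illustration of the state-independent scheduling policies mentioned above (the arrival model and all parameters here are invented for the sketch, not taken from the paper), a scheduler that serves a bounded batch of queued messages per slot, ignoring channel state, keeps the queue stable exactly when the mean arrival rate stays below the service limit:

```python
import random

def avg_backlog(arrival_p, n_sources=4, max_sched=2, slots=20000, seed=1):
    """Toy queue driven by i.i.d. Bernoulli arrivals from n_sources
    transmitters; a state-independent scheduler serves up to
    max_sched queued messages per slot, regardless of channel state."""
    rng = random.Random(seed)
    backlog, total = 0, 0
    for _ in range(slots):
        backlog += sum(rng.random() < arrival_p for _ in range(n_sources))
        backlog -= min(backlog, max_sched)   # schedule up to max_sched
        total += backlog
    return total / slots  # time-averaged backlog

stable = avg_backlog(0.3)    # mean load 1.2 < service 2: small backlog
unstable = avg_backlog(0.7)  # mean load 2.8 > service 2: backlog grows
```

With the load below the service limit the time-averaged backlog stays near zero; above it, the backlog grows linearly, mirroring the notion of a stability region.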
Abstract:
The thesis examines homeowners associations as part of the large-scale housing reform implemented in Russia since 2005. The reform transferred housing management from the public sector to the private sector and to citizens' responsibility. It continues the privatisation of the housing stock begun in Russia at the beginning of the 1990s, which aimed to build a market-oriented housing sector in the country. The reform fundamentally changes the Soviet system, in which ownership, management, and maintenance of housing were monopolised by the state. Homeowners are now responsible for managing the common areas in privatised houses, often by establishing a homeowners association. Homeowners associations are examined using the so-called common-pool resource regime approach, with the main question being how taking care of common property collectively succeeds in practice. The study is based on interview data from St. Petersburg's homeowners associations. Using common-pool resource theory, the study demonstrates why implementation of the housing reform has not succeeded as expected: certain elements that characterise a successful common-pool resource regime are not sufficiently fulfilled in St. Petersburg's homeowners associations. Firstly, free-riding, that is, withdrawing from the association's joint decision-making and not making the housing payments, is common, as the legislation lacks effective sanctions to prevent it: evicting or expelling a non-paying member from the association is not possible. Secondly, ownership of the land plot and of the common areas of the house, such as basements and attics, is often disputed between the associations and the authorities. In the Soviet era these common areas were public property along with the apartments, but in privatised houses they should, according to the legislation, belong to the association's property.
Thirdly, resolving disputes between the associations and the authorities, and within the associations themselves, is difficult, as the court system tends to be bureaucratic and inefficient. In addition to the common-pool resource approach, the study also examines how social capital contributes to the associations' effectiveness and democratic governance. It finds that although homeowners associations have increased cooperation and tightened social relations between neighbours, social capital has not been able to prevent free-riding. The study shows that, contrary to what is often claimed, the so-called 'Soviet mentality', that is, residents' passiveness and unwillingness to participate, is not the most important obstacle to the reform. Instead, the reform is impeded above all by imperfect institutional arrangements and by local authorities that prevent the associations from working as independent, self-governing bodies.
Abstract:
RECONNECT is a Network-on-Chip using a honeycomb topology. In this paper we focus on general rules, applicable to a variety of routing algorithms for the NoC, that take into account the links a honeycomb topology lacks compared to a mesh. We also extend the original proposal [5] and show a method to insert and extract data to and from the network. Access Routers at the boundary of the execution fabric establish connections to multiple periphery modules and create a torus to decrease the node distances. Our approach is scalable and ensures homogeneity among the compute elements in the NoC. We synthesized and evaluated the proposed enhancement in terms of power dissipation and area. Our results indicate that the impact of the necessary alterations to the fabric is negligible and affects the data transfer between the fabric and the periphery only marginally.
Abstract:
Changes in taxation of corporate dividends offer excellent opportunities to study dividend clientele effects. We explore payout policies and ownership structures around a major tax reform that took place in Finland in 2004. Consistent with dividend clienteles affecting firms’ dividend policy decisions, we find that Finnish firms altered their dividend policies based on the changed tax incentives of their largest shareholders. While firms adjusted their payout policies, our results indicate that the ownership structures of Finnish firms also changed around the 2004 reform, consistent with shareholder clienteles adjusting to the new tax system.
Abstract:
The open access (OA) model for journals is compared to the open source principle for computer software. Since the early 1990s nearly 1,000 OA scientific journals have emerged – mostly as voluntary community efforts, although recently some professionally operating publishers have used author charges or institutional membership. This study of OA journals without author charges shows that their impact is still relatively small, but awareness of it is increasing. The average number of research articles per year is lower than for major scientific journals but the publication times are shorter.
Abstract:
Background: The Internet has recently made possible the free global availability of scientific journal articles. Open Access (OA) can occur either via OA scientific journals, or via authors posting manuscripts of articles published in subscription journals in open web repositories. So far there have been few systematic studies showing how large the extent of OA is, in particular studies covering all fields of science. Methodology/Principal Findings: The proportion of peer-reviewed scholarly journal articles available openly in full text on the web was studied using a random sample of 1837 titles and a web search engine. Of articles published in 2008, 8.5% were freely available at the publishers’ sites. For an additional 11.9%, free manuscript versions could be found using search engines, making the overall OA percentage 20.4%. Chemistry (13%) had the lowest overall share of OA, Earth Sciences (33%) the highest. In medicine, biochemistry, and chemistry, publishing in OA journals was more common; in all other fields author-posted manuscript copies dominated the picture. Conclusions/Significance: The results show that OA already has a significant positive impact on the availability of the scientific journal literature and that there are big differences between scientific disciplines in the uptake. Because awareness of OA publishing among scientists is low in most fields outside physics, the results should be of general interest to all scholars. They should also interest academic publishers, who need to take OA into account in their business strategies and copyright policies, as well as research funders who, like the NIH, are starting to require OA availability of results from research projects they fund. The method and search tools developed also offer a good basis for more in-depth as well as longitudinal studies.
Abstract:
One of the effects of the Internet is that the dissemination of scientific publications has, within a few years, migrated to electronic formats. The basic business practices between libraries and publishers for selling and buying the content, however, have not changed much. In protest against the high subscription prices of mainstream publishers, scientists have started Open Access (OA) journals and e-print repositories, which distribute scientific information freely. Despite widespread agreement among academics that OA would be the optimal distribution mode for publicly financed research results, such channels still constitute only a marginal phenomenon in the global scholarly communication system. This paper discusses, in view of the experiences of the last ten years, the many barriers hindering a rapid proliferation of Open Access. The discussion is structured according to the main OA channels: peer-reviewed journals for primary publishing, and subject-specific and institutional repositories for secondary parallel publishing. It also discusses the types of barriers, which can be classified as the legal framework, the information technology infrastructure, business models, indexing services and standards, the academic reward system, marketing, and critical mass.
Abstract:
The Internet has made possible the cost-effective dissemination of scientific journals in the form of electronic versions, usually in parallel with the printed versions. At the same time the electronic medium also makes possible totally new open access (OA) distribution models, funded by author charges, sponsorship, advertising, voluntary work, etc., where the end product is free in full text to the readers. Although more than 2,000 new OA journals have been founded in the last 15 years, the uptake of open access has been rather slow, with currently around 5% of all peer-reviewed articles published in OA journals. The slow growth can to a large extent be explained by the fact that open access has predominantly emerged via newly founded journals and startup publishers. Established journals and publishers have not had strong enough incentives to change their business models, and the commercial risks in doing so have been high. In this paper we outline and discuss two different scenarios for how scholarly publishers could change their operating model to open access. The first is based on an instantaneous change and the second on a gradual change. We propose a way to manage the gradual change by bundling traditional “big deal” licenses and author charges for opening access to individual articles.
Abstract:
Introduction This case study is based on the experiences of the Electronic Journal of Information Technology in Construction (ITcon), founded in 1995. Development This journal is an example of a particular category of open access journals, which use neither author charges nor subscriptions to finance their operations, but rely largely on unpaid voluntary work in the spirit of the open source movement. The journal has, after some initial struggle, survived its first decade and is now established as one of half-a-dozen peer-reviewed journals in its field. Operations The journal publishes articles as they become ready, but creates virtual issues through alerting messages to “subscribers”. It has also started to publish special issues, since this helps in attracting submissions and in sharing the workload of review management. From the start the journal adopted a rather traditional layout for its articles. After the first few years the HTML version was dropped, and papers are now published only in PDF format. Performance The journal has recently been benchmarked against the competing journals in its field. Its acceptance rate of 53% is slightly higher than theirs, and its average turnaround time of seven months is almost a year faster than those of the journals in the sample for which data could be obtained. The server log files for the past three years have also been studied. Conclusions Our overall experience demonstrates that it is possible to publish this type of OA journal, with a yearly publishing volume equal to a quarterly journal and involving the processing of some fifty submissions a year, using a networked volunteer-based organization.
Abstract:
Introduction. We estimate the total yearly volume of peer-reviewed scientific journal articles published world-wide, as well as the share of these articles available openly on the Web, either directly or as copies in e-print repositories. Method. We rely on data from two commercial databases (ISI and Ulrich's Periodicals Directory) supplemented by sampling and Google searches. Analysis. A central issue is the finding that ISI-indexed journals publish far more articles per year (111) than non-ISI-indexed journals (26), which means that the total figure we obtain is much lower than many earlier estimates. Our method of analysing the number of repository copies (green open access) differs from several earlier studies, which counted the copies in identified repositories: we start from a random sample of articles and then test whether copies can be found by a Web search engine. Results. We estimate that in 2006 the total number of articles published was approximately 1,350,000. Of this number, 4.6% became immediately openly available, and an additional 3.5% after an embargo period of, typically, one year. Furthermore, usable copies of 11.3% could be found in subject-specific or institutional repositories or on the home pages of the authors. Conclusions. We believe our results are the most reliable so far published and, therefore, should be useful in the on-going debate about Open Access among both academics and science policy makers. The method is replicable and also lends itself to longitudinal studies in the future.
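The sample-then-search design described above can be sketched as follows. The labels and the 200-article sample below are hypothetical, chosen only to roughly mirror the reported shares, and are not the study's data:

```python
def extrapolate_oa(sample, total_articles):
    """Extrapolate OA shares from a random sample of articles:
    each sampled article is labelled by how (if at all) a free
    copy of it was found on the web."""
    n = len(sample)
    out = {}
    for label in ("gold", "delayed", "green", "closed"):
        share = sample.count(label) / n          # share in the sample
        out[label] = (share, round(share * total_articles))
    return out

# Hypothetical 200-article sample (illustrative, not the study's data)
sample = ["gold"] * 9 + ["delayed"] * 7 + ["green"] * 23 + ["closed"] * 161
shares = extrapolate_oa(sample, 1_350_000)
```

The point of the design is that the sample is drawn from articles, not from repositories, so the green-OA share reflects what a searcher would actually find, at the cost of sampling error that shrinks with sample size.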
Abstract:
Open access is a new model for the publishing of scientific journals enabled by the Internet, in which the published articles are freely available for anyone to read. During the 1990s hundreds of individual open access journals were founded by groups of academics, supported by grants and unpaid voluntary work. During the last five years other types of open access journals, funded by author charges, have started to emerge, and established publishers have also started to experiment with different variations of open access. This article reports on the experiences of one open access journal (The Electronic Journal of Information Technology in Construction, ITcon) over its ten-year history. In addition to a straightforward account of the lessons learned, the journal is benchmarked against a number of competitors in the same research area, and its development is put into the larger perspective of changes in scholarly publishing. The main findings are: a journal publishing around 20-30 articles per year, equivalent to a typical quarterly journal, can sustainably be produced using an open-source-like production model. The journal outperforms its competitors in some respects, such as speed of publication, availability of the results, and balanced global distribution of authorship, and is on a par with them in most other respects. The key statistics for ITcon are: acceptance rate 55%; average speed of publication 6-7 months; 801 subscribers to email alerts; average number of downloads by human readers per paper per month 21.
Abstract:
Copyright questions concerning the online use of scientific works in libraries and universities have recently caused headaches. Information networks and the digital environment constitute a special application environment for copyright, and working in it requires more detailed knowledge of data transfer, databases, and the technical functions of information networks in general. Since the technical solutions applied differ from one context to another, this paper aims to examine, at a general level, the copyright and contract-law questions between users and rights holders that arise from the use of works in information networks. The aim is to highlight the copyright-relevant issues that should be taken into account in agreements between original authors, publishers, and online publishers (for example a library or a university) when web publications are archived, transmitted, and linked to. The questions are examined particularly from the publisher's perspective. The paper also includes an empirical study of publishers' permission practices, examining how often, between 2000 and 2003, publishers granted permission to publish a dissertation article as part of a dissertation on the open, non-commercial web server of Helsinki University of Technology (Teknillinen korkeakoulu). Since links often play a significant role in web publishing but their copyright status is unclear, the latter part of the paper examines the copyright status of links.
Abstract:
In this paper we address the problem of distributed transmission of functions of correlated sources over a fast fading multiple access channel (MAC). This is a basic building block in a hierarchical sensor network used in estimating a random field, where the cluster head is interested only in estimating a function of the observations. The observations are transmitted to the cluster head through a fast fading MAC. We provide sufficient conditions for lossy transmission when the encoders and decoders are provided with partial information about the channel state. Furthermore, side information about the signal may be available at the encoders and the decoder. Various previous studies are shown to be special cases. Efficient joint source-channel coding schemes are discussed for transmission of discrete and continuous alphabet sources to recover function values.
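A minimal sketch of computing a function over the MAC in the spirit described above, under the simplifying assumption of full channel-state information at the transmitters (the paper treats the harder partial-CSI case, and all values here are invented for illustration): each encoder inverts its own fade, so the channel's additive superposition delivers the sum function to the cluster head in a single use, without decoding the individual sources.

```python
import random

def mac_function_estimate(xs, fades, noise_sd=0.1, rng=None):
    """Toy uncoded 'computation over the MAC': encoder i knows its own
    fade h_i and sends x_i / h_i, so the channel output is
    sum(x_i) + noise, i.e. a noisy estimate of f(x) = sum(x)."""
    rng = rng or random.Random(0)
    tx = [x / h for x, h in zip(xs, fades)]    # per-encoder channel inversion
    y = sum(h * s for h, s in zip(fades, tx))  # additive MAC superposition
    y += rng.gauss(0, noise_sd)                # receiver noise
    return y                                   # estimate of sum(xs)

xs = [1.0, 2.0, 3.0]       # hypothetical sensor observations
fades = [0.8, 1.3, 0.5]    # hypothetical per-link fade coefficients
est = mac_function_estimate(xs, fades)
```

The design choice this illustrates is that when the desired function matches the channel's natural operation (addition), transmitting uncoded observations can be far more efficient than recovering each source separately and computing the function afterwards.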