996 results for sharing features
Abstract:
Islamic financing instruments can be categorised into profit-and-loss/risk-sharing and non-participatory instruments. Although profit-and-loss-sharing instruments such as musharakah are widely accepted as the ideal form of Islamic financing, prior studies suggest that Islamic banks prefer alternative instruments such as murabahah. However, prior studies did not explore the factors that influence the use of Islamic financing among non-financial firms. Our study fills this gap and contributes new knowledge in several ways. First, we find no evidence of widespread use of Islamic financing instruments across non-financial firms; the instruments are mostly used by less profitable firms with higher leverage (i.e., risky firms). Second, we find that profit-and-loss-sharing instruments are hardly used, whilst murabahah dominates. Consistent with the prediction of moral-hazard-risk avoidance theory, further analysis suggests that users with a lower asset base (to serve as collateral) are associated with murabahah financing. Third, we present a critical discourse on the contentious nature of murabahah as practised. The economic significance and ethical issues associated with murabahah as practised should trigger serious efforts to steer Islamic corporate financing towards risk sharing rather than the controversial rent-seeking practice.
Abstract:
Durbin, J. & Urquhart, C. (2003). Qualitative evaluation of KA24 (Knowledge Access 24). Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: Knowledge Access 24 (NHS)
Abstract:
Q. Shen and R. Jensen, 'Selecting Informative Features with Fuzzy-Rough Sets and its Application for Complex Systems Monitoring,' Pattern Recognition, vol. 37, no. 7, pp. 1351-1363, 2004.
Abstract:
Griffiths, M. (2005). Children drawing toy commercials: re-imagining television production features. Visual Communication, 4(1), pp. 21-37.
Abstract:
UPNa. Instituto de Agrobiotecnología. Laboratorio de Biofilms Microbianos
Abstract:
The congestion control mechanisms of TCP make it vulnerable in an environment where flows with different congestion-sensitivity compete for scarce resources. With the increasing amount of unresponsive UDP traffic in today's Internet, new mechanisms are needed to enforce fairness in the core of the network. We propose a scalable Diffserv-like architecture in which flows with different characteristics are classified into separate service queues at the routers. Such class-based isolation provides protection, so that flows with different characteristics do not negatively impact one another. In this study, we examine different aspects of UDP and TCP interaction and the possible gains from segregating UDP and TCP into different classes. We also investigate the utility of further segregating TCP flows into two classes: one for short flows and one for long flows. Results are obtained analytically for both Tail-Drop and Random Early Drop (RED) routers. Class-based isolation has the following salient features: (1) better fairness, (2) improved predictability for all kinds of flows, (3) lower transmission delay for delay-sensitive flows, and (4) better control over the Quality of Service (QoS) of a particular traffic type.
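As a rough illustration of the class-based isolation idea described above, the sketch below classifies packets into separate per-class queues by transport protocol and TCP flow length. The Packet structure, class names, and the short-flow threshold are hypothetical assumptions, not the paper's implementation.

```python
# Illustrative sketch of Diffserv-like class-based isolation: packets are
# mapped to separate service queues so classes cannot starve each other.
from collections import deque
from dataclasses import dataclass

@dataclass
class Packet:
    proto: str        # "tcp" or "udp"
    flow_bytes: int   # bytes sent so far on this packet's flow

SHORT_FLOW_THRESHOLD = 100_000  # hypothetical cutoff for "short" TCP flows

queues = {
    "udp": deque(),        # unresponsive traffic isolated from TCP
    "tcp_short": deque(),  # short, often delay-sensitive TCP flows
    "tcp_long": deque(),   # long bulk-transfer TCP flows
}

def classify(pkt: Packet) -> str:
    """Map a packet to a service class."""
    if pkt.proto == "udp":
        return "udp"
    return "tcp_short" if pkt.flow_bytes <= SHORT_FLOW_THRESHOLD else "tcp_long"

def enqueue(pkt: Packet) -> None:
    queues[classify(pkt)].append(pkt)
```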
Abstract:
A foundational issue underlying many overlay network applications, ranging from routing to P2P file sharing, is that of connectivity management, i.e., folding new arrivals into the existing mesh and re-wiring to cope with changing network conditions. Previous work has considered the problem from two perspectives: devising practical heuristics for specific applications designed to work well in real deployments, and providing abstractions of the underlying problem that are tractable to address via theoretical analyses, especially game-theoretic analysis. Our work unifies these two thrusts by first distilling insights gleaned from clean theoretical models, notably that, under natural resource constraints, selfish players can select neighbors so as to efficiently reach near-equilibria that also provide high global performance. Using Egoist, a prototype overlay routing system we implemented on PlanetLab, we demonstrate that our neighbor selection primitives significantly outperform existing heuristics on a variety of performance metrics; that Egoist is competitive with an optimal but unscalable full-mesh approach; and that it remains highly effective under significant churn. We also describe variants of Egoist's current design that would enable it to scale to much larger overlays and to cater effectively to applications, such as P2P file sharing in unstructured overlays, that rely on primitives such as scoped flooding rather than routing.
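To make the neighbor-selection primitive concrete, here is a minimal best-response sketch in the spirit of the selfish neighbor selection games the abstract alludes to: a node picks the k neighbors minimizing its total cost to all destinations, given estimated link delays and the other nodes' current routing distances. The delay and dist inputs and the exhaustive search are illustrative assumptions, not Egoist's actual code.

```python
# Best-response sketch for selfish neighbor selection (illustrative only).
from itertools import combinations

def best_response(me, nodes, delay, dist, k):
    """Pick k neighbors minimizing my total cost to reach every destination,
    assuming each destination is routed through my best chosen neighbor.
    delay[u][v]: direct link delay; dist[v][t]: v's current routing distance."""
    others = [v for v in nodes if v != me]
    best_set, best_cost = None, float("inf")
    for cand in combinations(others, k):  # exhaustive; fine for small n and k
        cost = sum(min(delay[me][v] + dist[v][t] for v in cand)
                   for t in others)
        if cost < best_cost:
            best_set, best_cost = cand, cost
    return best_set
```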
Abstract:
We propose Trade & Cap (T&C), an economics-inspired mechanism that incentivizes users to voluntarily coordinate their consumption of the bandwidth of a shared resource (e.g., a DSLAM link) so as to converge on what they perceive to be an equitable allocation, while ensuring efficient resource utilization. Under T&C, rather than acting as an arbiter, an Internet Service Provider (ISP) acts as an enforcer of what the community of rational users sharing the resource decides is a fair allocation of that resource. Our T&C mechanism proceeds in two phases. In the first, software agents acting on behalf of users engage in a strategic trading game in which each user agent selfishly chooses bandwidth slots to reserve in support of primary, interactive network usage activities. In the second phase, each user is allowed to acquire additional bandwidth slots in support of a presumed open-ended need for fluid bandwidth, catering to secondary applications. The acquisition of this fluid bandwidth is subject to the remaining "buying power" of each user and to prevailing "market prices", both of which are determined by the results of the trading phase and a desirable aggregate cap on link utilization. We present analytical results that establish the underpinnings of our T&C mechanism, including game-theoretic results pertaining to the trading phase and pricing of fluid bandwidth allocation pertaining to the capping phase. Using real network traces, we present extensive experimental results that demonstrate the benefits of our scheme, which we also show to be practical by highlighting the salient features of an efficient implementation architecture.
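As a toy illustration of the capping phase, the sketch below binary-searches a uniform "market price" at which users' fluid-bandwidth purchases, limited by their leftover "buying power", just fill the aggregate cap. This is a simplified stand-in under stated assumptions, not the pricing rule derived in the paper.

```python
# Toy clearing-price search for capped fluid-bandwidth allocation.
# Assumes aggregate demand exceeds the cap (the cap is binding).
def market_price(buying_power, demand, cap, iters=60):
    """Find a price p at which each user buys min(b/p, d) slots and the
    total allocation meets the cap. Inputs are hypothetical per-user lists."""
    def allocated(p):
        return sum(min(b / p, d) for b, d in zip(buying_power, demand))
    lo, hi = 1e-9, 1e12  # hypothetical price bracket
    for _ in range(iters):
        mid = (lo + hi) / 2
        if allocated(mid) > cap:   # demand falls as price rises
            lo = mid
        else:
            hi = mid
    return hi
```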
Abstract:
In this paper we present Statistical Rate Monotonic Scheduling (SRMS), a generalization of the classical RMS results of Liu and Layland that allows scheduling periodic tasks with highly variable execution times and statistical QoS requirements. Like RMS, SRMS has two components: a feasibility test and a scheduling algorithm. The feasibility test for SRMS ensures that, using the SRMS scheduling algorithm, a given periodic task set can share a given resource (e.g., a processor, communication medium, or switching device) without violating any of the periodic tasks' QoS constraints. The SRMS scheduling algorithm incorporates a number of unique features. First, it allows for fixed-priority scheduling that keeps the tasks' value (or importance) independent of their periods. Second, it allows for job admission control, rejecting jobs that are not guaranteed to finish by their deadlines as soon as they are released, thus enabling the system to take necessary compensating actions; admission control also preserves resources, since no time is spent on jobs that would miss their deadlines anyway. Third, SRMS integrates reservation-based and best-effort resource scheduling seamlessly: reservation-based scheduling ensures the delivery of the minimal requested QoS, while best-effort scheduling ensures that unused reserved bandwidth is not wasted but rather used to improve QoS further. Fourth, SRMS allows a system to deal gracefully with overload conditions by ensuring a fair deterioration in QoS across all tasks, as opposed to penalizing tasks with longer periods, for example. Finally, SRMS has the added advantage that its schedulability test is simple and its scheduling algorithm has constant overhead, in the sense that the complexity of the scheduler does not depend on the number of tasks in the system. We have evaluated SRMS against a number of alternative scheduling algorithms suggested in the literature (e.g., RMS and slack stealing), as well as refinements thereof, which we describe in this paper. Consistently throughout our experiments, SRMS provided the best performance. In addition, to evaluate the optimality of SRMS, we compared it to an inefficient yet optimal scheduler for task sets with harmonic periods.
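For context, the classical Liu and Layland feasibility test that SRMS generalizes fits in a few lines; the sketch below implements that classical utilization bound (a sufficient condition), not the SRMS test itself.

```python
# Classical RMS feasibility (Liu & Layland): n periodic tasks with
# worst-case execution times C_i and periods T_i are schedulable under
# rate monotonic priorities if total utilization <= n * (2^(1/n) - 1).
def rms_feasible(tasks):
    """tasks: list of (C, T) pairs; returns True if the L&L bound holds."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    return utilization <= n * (2 ** (1 / n) - 1)

# Example: U = 0.2 + 0.25 + 0.3 = 0.75 <= 3*(2^(1/3)-1) ~= 0.78 -> True
print(rms_feasible([(2, 10), (5, 20), (6, 20)]))
```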
Abstract:
Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One of these examples, which predicts movie genres, was developed for this project and is based, in part, on the Netflix Prize database. Both first principles and consistent performance improvements across all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/.
Abstract:
This thesis presents research theorising the use of social network sites (SNS) for the consumption of cultural goods. SNS are Internet-based applications that enable people to connect, interact, discover, and share user-generated content. They have transformed communication practices and enable users to present their identity online through the disclosure of information on a profile. SNS are especially effective for propagating content far and wide within a network of connections. Cultural goods constitute hedonic experiential goods with cultural, artistic, and entertainment value, such as music, books, films, and fashion. Their consumption is culturally dependent, and they have unique characteristics that distinguish them from utilitarian products. Users express their identity on SNS through the sharing of cultural interests and tastes. This exposes the consumption of cultural goods to the exchange of content and ideas that occurs across an expansive network of connections within these social systems. This study proposes the lens of affordances to theorise the use of social network sites for the consumption of cultural goods. Qualitative case study research using two phases of data collection is proposed to apply affordances to the research topic. The interaction between task, technology, and user characteristics is investigated by examining each characteristic in detail, before investigating the actual interaction between the user and the artifact for a particular purpose. The study contributes to knowledge by (i) improving our understanding of the affordances of social network sites for the consumption of cultural goods, (ii) demonstrating the role of task, technology, and user characteristics in mediating user behaviour for user-artifact interactions, (iii) explaining the technical features and user activities important to the process of consuming cultural goods using social network sites, and (iv) theorising the consumption of cultural goods using SNS by presenting a theoretical research model which identifies empirical indicators of model constructs and maps out affordance dependencies and hierarchies. The study also provides a systematic research process for applying the concept of affordances to the study of system use.
Abstract:
It is apparent from the widespread distribution of burnt mounds that Ireland was the most prolific user of pyrolithic technology in Bronze Age Europe. Even though burnt mounds are the most common prehistoric site type in Ireland, they have not received the same level of research as other prehistoric sites. This is primarily due to the paucity of artefact finds and the unspectacular nature of the archaeological remains, compounded by the absence of an appropriate research framework. Due to the widespread use of the technology and the various applications of hot water, narratives related to these sites have revolved around discussions of age and function. This has resulted in a generalised classification, where the term ‘fulacht fia’ covers several site types that have similar features but differing functions and ages. This study presents a re-evaluation of fulachtaí fia in light of some 1,000 sites excavated in Ireland. It is the most comprehensive study undertaken on the use of pyrolithic technology in prehistoric Ireland, dealing with different aspects of site function, chronology, social role and cultural context. A number of key areas have been identified in relation to our understanding of these sites. Previous investigations of burnt mounds have provided little information on the temporality of individual sites. It has been established that appropriate sampling strategies can provide important information about the formation of individual sites, their relationships to each other and to other monuments in the same cultural landscape. The evidence suggests that considerable caution should be exercised with regard to certain single radiometric dates from burnt stone deposits, based on the degree of certainty of the dated sample and its association with pyrolithic activity. Previously regarded as Bronze Age in date, there are now numerous examples of pyrolithic-type processes in earlier contexts, with the origins of the water-boiling phenomenon now considered to be Early Neolithic. A review of recent excavation evidence provides new insights into the use of pyrolithic technology for cooking, based on the discovery of faunal remains at several sites combined with insights gained through experimental studies. The model proposed here is of open-air communal feasting and food sharing hosted by small family groups, as a medium for social bonding and the construction of community. It is also argued that if cooking was the primary activity taking place at these sites, it should not be viewed as a mundane functional activity, but rather one that actively contributed to the constitution of social relations. The formality of the technology is also supported by the presence of possible specialised structures, some of which were used for cooking/feasting while others were for ritualised sweat-bathing. The duration and frequency of activities associated with burnt mounds, and the opportunities they provided for social interaction, suggest that these sites contributed familiar frames of reference to contemporary discourse.
Abstract:
The authors analyzed several cytomorphonuclear parameters related to chromatin distribution and DNA ploidy in typical and atypical carcinoids and in small cell lung cancers. Nuclear measurements and analysis were performed with a SAMBA 200 (TITN, Grenoble, France) cell image processor, with software allowing the discrimination of parameters computed on cytospin preparations of Feulgen-stained nuclei extracted from deparaffinized tumor tissues. The authors' results indicate a significant increase in DNA content, assessed by integrated optical density (IOD), from typical carcinoids to small cell lung carcinomas, with atypical carcinoids showing an intermediate value. Parameters related to hyperchromatism (short and long run length and variance of optical density) also characterize the atypical carcinoids as intermediate between typical carcinoids and small cell lung cancers. The systematic measurement of these cytomorphonuclear parameters seems to define an objective, reproducible "scale" of differentiation that helps to define the atypical carcinoid and may be of value in establishing cytologic criteria for differential diagnosis.
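For readers unfamiliar with the densitometry behind the IOD parameter, a minimal sketch follows: per-pixel optical density is -log10(I/I0), and IOD sums it over the segmented nucleus. The inputs are hypothetical; this is a generic illustration, not the SAMBA 200 software.

```python
# Generic integrated optical density (IOD) computation for a stained nucleus.
import numpy as np

def integrated_optical_density(image, nucleus_mask, background_intensity):
    """image: 2-D array of transmitted intensities; nucleus_mask: boolean
    array selecting nucleus pixels; background_intensity: clear-field I0."""
    intensities = np.clip(image[nucleus_mask], 1e-6, None)  # avoid log(0)
    od = -np.log10(intensities / background_intensity)      # per-pixel OD
    return od.sum()                                         # IOD
```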
Abstract:
We present measurements of morphological features in a thick turbid sample using light-scattering spectroscopy (LSS) and Fourier-domain low-coherence interferometry (fLCI) by processing with the dual-window (DW) method. A parallel frequency-domain optical coherence tomography (OCT) system with a white-light source is used to image a two-layer phantom containing polystyrene beads of diameters 4.00 µm and 6.98 µm in the top and bottom layers, respectively. The DW method decomposes each OCT A-scan into a time-frequency distribution with simultaneously high spectral and spatial resolution. The spectral information from localized regions in the sample is used to determine scatterer structure. The results show that the two scatterer populations can be differentiated using LSS and fLCI.
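A minimal sketch of the dual-window idea follows, under the simplifying assumption that it suffices to multiply the magnitudes of two Gaussian-windowed short-time Fourier transforms: one narrow window for fine spatial resolution, one wide window for fine spectral resolution. The window widths are hypothetical, and this is not the authors' processing code.

```python
# Simplified dual-window time-frequency distribution for an A-scan.
import numpy as np

def gaussian_stft(signal, window_std):
    """Magnitude of a Gaussian-windowed STFT; rows index window centers."""
    n = len(signal)
    k = np.arange(n)
    rows = [np.fft.fft(signal * np.exp(-0.5 * ((k - c) / window_std) ** 2))
            for c in range(n)]
    return np.abs(np.array(rows))

def dual_window_tfd(interferogram, narrow_std=5.0, wide_std=50.0):
    """Element-wise product of narrow- and wide-window STFT magnitudes."""
    return (gaussian_stft(interferogram, narrow_std) *
            gaussian_stft(interferogram, wide_std))
```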
Abstract:
BACKGROUND: Sharing of epidemiological and clinical data sets among researchers is poor at best, to the detriment of science and the community at large. The purpose of this paper is therefore to (1) describe a novel Web application designed to share information on study data sets, focusing on epidemiological and clinical research in a collaborative environment, and (2) create a policy model placing this collaborative environment into the current scientific social context. METHODOLOGY: The Database of Databases application was developed based on feedback from epidemiologists and clinical researchers requiring a Web-based platform that would allow for sharing of information about epidemiological and clinical study data sets in a collaborative environment, while ensuring that researchers can modify the information. Model-based predictions of the number of publications and funding resulting from combinations of different policy implementation strategies (for metadata and data sharing) were generated using System Dynamics modeling. PRINCIPAL FINDINGS: The application allows researchers to easily upload information about clinical study data sets, which is searchable and modifiable by other users in a wiki environment. All modifications are filtered by the database principal investigator in order to maintain quality control. The application has been extensively tested and currently contains 130 clinical study data sets from the United States, Australia, China and Singapore. Model results indicated that any policy implementation would be better than the current strategy, that metadata sharing is better than data sharing, and that combined policies achieve the best results in terms of publications. CONCLUSIONS: Based on our empirical observations and the resulting model, the social network environment surrounding the application can assist epidemiologists and clinical researchers in contributing and searching for metadata in a collaborative environment, thus potentially facilitating collaboration among research communities distributed around the globe.