902 results for Multimedia Super Corridor Malaysia
Abstract:
This thesis establishes a framework to guide the development of simulated, multimedia-enriched, immersive learning environments (SMILE). The framework models the essential media components used to describe a scenario applied in healthcare (in a dementia context), demonstrates the interactions between those components, and enables scalable simulation implementation. The thesis outcomes also include a simulation system developed in accordance with the guidance framework and a preliminary evaluation through a user study involving ten nursing students and practitioners. The results show that the proposed framework is feasible and effective for designing a simulation system for dementia healthcare training.
Abstract:
We extend the modeling heuristic of Harsha et al. (2006, IEEE IWQoS '06, pp. 178-187) to evaluate the performance of an IEEE 802.11e infrastructure network carrying packet telephone calls, streaming video sessions, and TCP-controlled file downloads using Enhanced Distributed Channel Access (EDCA). We identify the time boundaries of activities on the channel (called channel slot boundaries) and derive a Markov renewal process (MRP) for the contending nodes embedded at these epochs. This is achieved by using the attempt probabilities of the contending nodes obtained from the saturation fixed-point analysis of Ramaiyan et al. (2005, Proceedings of ACM SIGMETRICS '05; journal version accepted for publication in IEEE TON). Regenerative analysis of this MRP yields the desired steady-state performance measures. We then use the MRP model to develop an effective bandwidth approach for bounding the size of the buffer required at the video queue of the AP such that the streaming video packet loss probability is kept below 1%. The results match well with simulations using the network simulator ns-2. We find that, with the default IEEE 802.11e EDCA parameters for access categories AC 1, AC 2, and AC 3, the voice call capacity decreases if even one streaming video session and one TCP file download are initiated by some wireless station. For every voice call removed, the downlink video stream throughput increases by 0.38 Mbps and the file download capacity by 0.14 Mbps (for the 11 Mbps PHY). We find that a buffer size of 75 KB is sufficient to keep the video packet loss probability at the QAP within 1%.
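As an illustration of the buffer-sizing step, the following minimal sketch inverts the standard effective-bandwidth bound P(Q > B) ≈ exp(−θ*B). The per-slot arrival distribution and the service rate are made up for the example; in the paper these quantities come from the MRP analysis.

    import math

    # Hypothetical i.i.d. per-slot arrival distribution (bytes) at the AP's
    # video queue; in the paper the arrival process comes from the MRP model.
    arrivals = [0, 500, 1500]       # possible arrivals in one channel slot
    probs    = [0.5, 0.3, 0.2]      # their probabilities

    def effective_bandwidth(theta):
        """EB(theta) = Lambda(theta)/theta, Lambda the log-MGF of arrivals."""
        mgf = sum(p * math.exp(theta * a) for a, p in zip(arrivals, probs))
        return math.log(mgf) / theta

    def buffer_bound(service_rate, loss_target=0.01):
        """theta* = sup{theta : EB(theta) <= service_rate}; the bound
        P(Q > B) ~ exp(-theta* B) then gives B = ln(1/eps)/theta*."""
        lo, hi = 1e-9, 1.0
        while effective_bandwidth(hi) <= service_rate:  # EB increases in theta
            hi *= 2.0
            if hi > 1e9:
                raise ValueError("peak rate below service rate: no queueing")
        for _ in range(200):                            # bisection for theta*
            mid = 0.5 * (lo + hi)
            if effective_bandwidth(mid) <= service_rate:
                lo = mid
            else:
                hi = mid
        return math.log(1.0 / loss_target) / lo

    # Buffer (bytes) so that video loss stays below 1% at 800 bytes per slot.
    print(f"buffer >= {buffer_bound(800):.0f} bytes")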
Abstract:
The mid-December 2006 to late January 2007 flood in southern Peninsular Malaysia was the worst flood in a century and was caused by three extreme precipitation episodes. These episodes were mainly associated with strong northeasterly winds over the South China Sea. In all three cases, the northeasterlies penetrated anomalously far south and followed an almost straight trajectory. The elevated terrain over Sumatra and southern Peninsular Malaysia caused low-level convergence. The strong easterly winds near Java associated with the Rossby-wave-type response to the Madden-Julian Oscillation (MJO) inhibited the counter-clockwise turning of the northeasterlies and the formation of the Borneo vortex, which, in turn, enhanced the low-level convergence over the region. The abrupt termination of the Indian Ocean Dipole (IOD) in December 2006 played a secondary role, as the warmer equatorial Indian Ocean aided the formation of the MJO.
Abstract:
Bandwidth allocation for multimedia applications in the event of network congestion and failure poses technical challenges due to the bursty and delay-sensitive nature of these applications. The growth of multimedia services on the Internet and the development of agent technology have led us to investigate new techniques for resolving bandwidth issues in multimedia communications. Agent technology is emerging as a flexible and promising solution for network resource management and QoS (Quality of Service) control in a distributed environment. In this paper, we propose an adaptive bandwidth allocation scheme for multimedia applications that deploys static and mobile agents. It is a run-time allocation scheme that functions at the network nodes. The technique adaptively finds an alternate patch-up route for every congested/failed link and reallocates bandwidth for the affected multimedia applications. The method has been tested (analytically and by simulation) on various network sizes and conditions, and results are presented to assess its performance and effectiveness. This work also demonstrates some of the benefits of agent-based schemes: flexibility, adaptability, software reusability, and maintainability. (C) 2004 Elsevier Inc. All rights reserved.
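The route-computation step such an agent might perform can be sketched as a widest-path (maximum bottleneck bandwidth) search around the failed link. The topology, link capacities, and function names below are illustrative assumptions, not the paper's actual algorithm:

    import heapq
    from collections import defaultdict

    # Hypothetical residual-bandwidth map: (u, v) -> Mbps free on that link.
    links = {("A", "B"): 10, ("B", "C"): 4, ("A", "D"): 8,
             ("D", "C"): 6, ("B", "D"): 5}

    def widest_patchup_route(links, src, dst, failed):
        """Widest-path (max-min bandwidth) search for an alternate route
        around a failed or congested link."""
        adj = defaultdict(list)
        for (u, v), bw in links.items():
            if (u, v) == failed or (v, u) == failed:
                continue                        # skip the broken link
            adj[u].append((v, bw))
            adj[v].append((u, bw))
        best = {src: float("inf")}              # best bottleneck seen per node
        heap = [(-best[src], src, [src])]
        while heap:
            neg_bw, node, path = heapq.heappop(heap)
            if node == dst:
                return path, -neg_bw            # route and its spare bandwidth
            for nxt, bw in adj[node]:
                bottleneck = min(-neg_bw, bw)
                if bottleneck > best.get(nxt, 0.0):
                    best[nxt] = bottleneck
                    heapq.heappush(heap, (-bottleneck, nxt, path + [nxt]))
        return None, 0.0

    route, bw = widest_patchup_route(links, "A", "C", failed=("B", "C"))
    print(route, bw)                            # ['A', 'D', 'C'] with 6 Mbps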
Abstract:
This dissertation concerns the Punan Vuhang, former hunter-gatherers who are now part-time farmers living in an area of remote rainforest in the Malaysian state of Sarawak. It covers two themes: first, it examines their methods of securing a livelihood in the rainforest; second, it looks at their adaptation to settled life and agriculture and their response to rapid, large-scale commercial logging. The study engages the long-running debates among anthropologists and ecologists over whether recent hunting-gathering societies could survive in the tropical rainforest without depending on farming societies for food resources. In the search for evidence, the study poses three questions: What food resources were available to rainforest hunter-gatherers? How did they hunt and gather these foods? How did they cope with periodic food shortages? In fashioning a life in the rainforest, the Punan Vuhang survived resource scarcity by developing adaptive strategies built on intensive knowledge of the forest and its resources. They also adopted social practices such as sharing, reciprocity, and resource tenure to sustain themselves without recourse to external sources of food. In the 1960s, the Punan Vuhang settled down in response to external influences arising in part from the Indonesian-Malaysian Confrontation. This, in turn, set off a series of processes with political, economic, and religious implications. Elements of the traditional economy have nevertheless remained resilient, as the people continue to hunt, fish, and gather, and are able to farm on an individual basis, unlike neighboring shifting cultivators who need to cooperate with one another. At the beginning of the 21st century, the Punan Vuhang face a new challenge arising from questions of rights under state and national law and from large-scale commercial logging in their forest habitat. The future seems bleak as they face the social problems of alcoholism, declining leadership, and dependence on cash income and market commodities.
Abstract:
Professional and ethical pronouncements require auditors to be competent and to exercise due care and skill in the performance of their audits. In this study, we examine what happens when auditors take on more clients than they should, raising doubts about their ability to maintain competence and audit quality. Using 2,803 observations of Malaysian companies from 2010 to 2013, we find that auditors with multiple clients are associated with lower earnings quality, proxied by total accruals and discretionary accruals. Our results demonstrate that associating client firms' reported discretionary accruals with individual auditors, rather than with their firms or offices, is important in determining audit quality. Moreover, we demonstrate that the disclosure of auditors' signatures on their reports is useful for assessing auditor quality at the individual level, thus contributing to the debate on the usefulness of having auditor identities on reports.
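The abstract does not spell out how discretionary accruals are estimated; a common approach in this literature (assumed here, not confirmed by the abstract) is a Jones-type regression whose residual is taken as the discretionary component. A sketch on synthetic data, purely to illustrate the mechanics:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200                                   # synthetic firm-years
    assets = rng.uniform(50, 500, n)          # lagged total assets
    d_rev  = rng.normal(10, 5, n)             # change in revenues
    ppe    = rng.uniform(20, 200, n)          # property, plant and equipment
    total_accruals = 0.05 * d_rev - 0.02 * ppe + rng.normal(0, 2, n)

    # Scale by lagged assets, as is standard in accruals models, then take
    # the regression residual as the discretionary accruals proxy.
    y = total_accruals / assets
    X = np.column_stack([1.0 / assets, d_rev / assets, ppe / assets])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    discretionary = y - X @ beta
    print(f"mean |DA| = {np.abs(discretionary).mean():.4f}")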
Abstract:
We provide a survey of some of our recent results ([9], [13], [4], [6], [7]) on the analytical performance modeling of IEEE 802.11 wireless local area networks (WLANs). We first present extensions of Bianchi's decoupling approach ([1]) to the saturation analysis of IEEE 802.11e networks with multiple traffic classes. We have found that, even when analysing WLANs with unsaturated nodes, the following state-dependent service model works well: when a certain set of nodes is nonempty, their channel attempt behaviour is obtained from the corresponding fixed-point analysis of the saturated system. We present our experiences in using this approximation to model multimedia traffic over an IEEE 802.11e network using the enhanced distributed channel access (EDCA) mechanism. We have found that TCP-controlled file transfers, VoIP packet telephony, and streaming video in the IEEE 802.11e setting can all be modeled by this simple approximation.
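For readers unfamiliar with the decoupling approach, a minimal sketch of a Bianchi-style saturation fixed point follows. The window parameters (W = 32, m = 5) are illustrative single-class DCF-like values, not the per-class EDCA parameters the survey actually analyses:

    def tau_of_p(p, W=32, m=5):
        """Bianchi's saturation attempt rate given collision probability p
        (W = minimum contention window, m = number of backoff stages)."""
        return (2 * (1 - 2 * p)) / ((1 - 2 * p) * (W + 1)
                                    + p * W * (1 - (2 * p) ** m))

    def saturation_fixed_point(n, W=32, m=5, iters=500, damp=0.5):
        """Damped iteration of the coupled equations
        tau = tau_of_p(p) and p = 1 - (1 - tau)**(n - 1)."""
        p = 0.1
        for _ in range(iters):
            tau = tau_of_p(p, W, m)
            p = damp * p + (1 - damp) * (1 - (1 - tau) ** (n - 1))
            p = min(max(p, 1e-6), 0.499)   # stay clear of the 2p = 1 pole
        return p, tau_of_p(p, W, m)

    for n in (5, 10, 20):
        p, tau = saturation_fixed_point(n)
        print(f"n={n:2d}: collision prob p={p:.3f}, attempt rate tau={tau:.4f}")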
Abstract:
Polymeric admixtures modify the properties of the processed concrete; ductility is one such modification. This investigation develops a method of incorporating natural rubber latex into the concrete ingredients with only a marginal effect on the compressive strength of the plain base concrete. Strength is retained by reducing the water/cement ratio with the aid of a superplasticizer. The quantity of natural rubber latex is expressed as the dry rubber content as a percentage of concrete volume. Compressive and tensile strengths, as well as post-peak ductile behaviour, form the basis for comparison with unmodified concrete.
Abstract:
CD-ROMs have proliferated as a distribution medium for desktop machines for a large variety of multimedia applications (targeted at a single-user environment) such as encyclopedias, magazines, and games. With CD-ROM capacities of up to 3 GB becoming available in the near future, they will form an integral part of Video on Demand (VoD) servers storing full-length movies and multimedia. In the first section of this paper we look at issues related to the single-user desktop environment. Since these multimedia applications are highly interactive in nature, we take a pragmatic approach and study application behavior in detail by tracing the I/O request patterns generated to the CD-ROM subsystem. We discuss prefetch buffer design and seek-time characteristics in the context of the analysis of these traces. We also propose an adaptive main-memory-hosted cache that receives caching hints from the application to reduce latency when the user moves from one node of the hypergraph to another. In the second section we look at the use of CD-ROM in a VoD server and discuss the problems of scheduling multiple request streams and of buffer management in this scenario. We adapt the C-SCAN (Circular SCAN) algorithm to suit CD-ROM drive characteristics and prove that it is optimal in terms of buffer size management. We provide computationally inexpensive relations by which this algorithm can be implemented. We then propose an admission control algorithm that admits new request streams without disrupting the continuity of playback of previously admitted streams. The algorithm also supports operations such as fast forward and replay. Finally, in the third section, we discuss the problem of optimal placement of MPEG streams on CD-ROMs.
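The core of the circular sweep is easy to sketch. The block numbers below are made up, and real CD-ROM scheduling must also account for the drive's seek and spin-up characteristics discussed in the paper:

    def cscan_order(pending, head):
        """Serve requests at or ahead of the head position in increasing
        block order, then wrap to the lowest block (Circular SCAN)."""
        ahead  = sorted(b for b in pending if b >= head)
        behind = sorted(b for b in pending if b < head)
        return ahead + behind              # one sweep, then wrap around

    # Hypothetical outstanding stream requests, by CD-ROM block number.
    print(cscan_order([9800, 120, 4500, 7700, 300], head=4000))
    # -> [4500, 7700, 9800, 120, 300]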
Abstract:
We have imaged the H92α and H75α radio recombination line (RRL) emission from the starburst galaxy NGC 253 with a resolution of ~4 pc. The peak of the RRL emission at both frequencies coincides with the unresolved radio nucleus. Both lines observed toward the nucleus are extremely wide, with FWHMs of ~200 km s⁻¹. Modeling the RRL and radio continuum data for the radio nucleus shows that the lines arise in gas whose density is ~10⁴ cm⁻³ and whose mass is a few thousand M☉, which requires an ionizing flux of (6-20) × 10⁵¹ photons s⁻¹. We consider a supernova remnant (SNR) expanding in a dense medium, a star cluster, and an active galactic nucleus (AGN) as potential ionizing sources. Based on dynamical arguments, we rule out an SNR as a viable ionizing source. A star cluster model is considered, and the dynamics of the ionized gas in a stellar-wind-driven structure are investigated. Such a model is consistent with the properties of the ionized gas only for a cluster younger than ~10⁵ yr. The existence of such a young cluster at the nucleus seems improbable. The third model assumes the ionizing source to be an AGN at the nucleus. In this model, it is shown that the observed X-ray flux is too weak to account for the required ionizing photon flux. However, the ionization requirement can be explained if the accretion disk is assumed to have a big blue bump in its spectrum. Hence, we favor an AGN at the nucleus as the source responsible for ionizing the gas that produces the observed RRLs. A hybrid model consisting of an inner advection-dominated accretion flow disk and an outer thin disk is suggested, which could explain the radio, UV, and X-ray luminosities of the nucleus.
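The quoted gas mass follows from a simple ionization-balance estimate: in equilibrium, the ionizing photon rate Q must balance the recombination rate n²·αB·V. A back-of-the-envelope check using the abstract's own numbers, assuming uniform, fully ionized hydrogen:

    import math

    ALPHA_B = 2.6e-13    # case-B recombination coefficient, cm^3/s at ~1e4 K
    M_H     = 1.67e-24   # hydrogen atom mass, g
    M_SUN   = 1.99e33    # solar mass, g
    PC      = 3.086e18   # parsec, cm

    n_e = 1.0e4          # electron density, cm^-3 (from the RRL modeling)
    Q   = 6.0e51         # low end of the required ionizing flux, photons/s

    volume = Q / (n_e ** 2 * ALPHA_B)                  # cm^3
    radius = (3.0 * volume / (4.0 * math.pi)) ** (1 / 3)
    mass   = n_e * M_H * volume / M_SUN                # ionized hydrogen mass

    print(f"r ~ {radius / PC:.1f} pc, M ~ {mass:.0f} Msun")
    # -> r ~ 1.2 pc, M ~ 1937 Msun: "a few thousand" solar masses indeed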
Abstract:
In this paper, we study how TCP and UDP flows interact with each other when the end system is a CPU-resource-constrained thin client. The problem addressed is twofold: (1) the throughput of TCP flows degrades severely in the presence of heavily loaded UDP flows, and (2) fairness and the minimum QoS requirements of the UDP flows are not maintained. First, we identify the factors affecting TCP throughput through an in-depth analysis of end-to-end delay and packet loss variations. The results from this first part lead to our second contribution: we propose and study an algorithm that ensures fairness across flows. The algorithm improves the performance of TCP flows in the presence of multiple UDP flows admitted under an admission algorithm, while maintaining the minimum QoS requirements of the UDP flows. The advantage of the algorithm is that it requires no changes to the TCP/IP stack; control is achieved entirely through receiver window control.
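Receiver window control works because an advertised window of roughly rate × RTT bounds the sender's in-flight data without touching the sender or the stack. A toy calculation of such a window, with an assumed fair-share rate and RTT (not the paper's actual control law):

    import math

    def advertised_window(fair_share_bps, rtt_s, mss=1460):
        """Cap a flow near its fair share from the receiver side alone:
        advertising a window of rate * RTT bounds the sender's in-flight
        data, so no sender-side or TCP/IP stack changes are needed."""
        window_bytes = fair_share_bps / 8.0 * rtt_s
        segments = max(1, math.ceil(window_bytes / mss))  # whole segments
        return segments * mss

    # e.g. hold a TCP flow near 2 Mbps over a 60 ms path
    print(advertised_window(2_000_000, 0.060))            # 16060 bytes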
Abstract:
This paper addresses the problem of selecting the optimal number of sensors and determining their placement in a given monitored area for multimedia surveillance systems. We propose to solve this problem by deriving a novel performance metric, a probability measure of accomplishing the task as a function of the set of sensors and their placement, and using this measure to find the optimal set. The same measure can be used to analyze the degradation in the system's performance with respect to the failure of various sensors. We also build a surveillance system using the optimal set of sensors obtained with the proposed design methodology. Experimental results show the effectiveness of the methodology in selecting the optimal set of sensors and their placement.
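A toy version of the selection step: if sensors are assumed to miss a task independently, the accomplishment probability of a subset is 1 − Π(1 − pᵢ), and small candidate pools can be searched exhaustively. The sensors, placements, and probabilities below are made up; the paper derives its measure from actual sensor and placement models:

    from itertools import combinations

    # Hypothetical per-(sensor, placement) detection probabilities.
    candidates = {"cam@door": 0.70, "cam@hall": 0.55,
                  "mic@hall": 0.40, "ir@door": 0.50}

    def task_probability(chosen):
        """P(task accomplished), assuming sensors miss independently."""
        miss = 1.0
        for name in chosen:
            miss *= 1.0 - candidates[name]
        return 1.0 - miss

    def best_subset(k):
        """Exhaustively pick the k sensors maximizing the metric."""
        return max(combinations(candidates, k), key=task_probability)

    subset = best_subset(2)
    print(subset, round(task_probability(subset), 3))
    # ('cam@door', 'cam@hall') with P = 0.865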