444 results for Virtual Market
Abstract:
The benefits of virtual communities in increasing firms' profits, instilling knowledge in consumers, and enhancing consumers' social experience and enjoyment are widely recognised. However, relatively little is known about how the use of a virtual community could influence consumers' emotional well-being. This study examines the relationships among virtual community features (structural and experiential routes) as antecedents of virtual community engagement, including quality of use of virtual communities (time spent online and level of information exchange), electronic word-of-mouth (eWOM) purchasing behaviour, and consumers' emotional experience. Furthermore, by extending the cultural perspective to virtual community engagement, this study examines the role of collectivistic values in the aforementioned relationships. The proposed hypotheses are tested on data collected from 286 members of different virtual communities in Taiwan. The results partially support the theory that features of virtual communities influence the quality of use, which in turn affects consumer eWOM purchasing and emotional well-being. The results of the empirical analysis add credence to the proposed relationships. The role of collectivistic values is also partially supported. A detailed discussion of the findings and limitations of this study is provided.
Abstract:
Construction professional service (CPS) in the international arena has been highly competitive, even though the industry is growing at a high rate. To excel in international business, CPS firms need to build their overseas competition strategies on a proper understanding of the international CPS (I-CPS) market. However, owing to borderless trade, information technology-based networking, global outsourcing, and changing forms of procurement, the I-CPS market structure has become more opaque and intricate than before. Through examining business competition among top international design firms, this study aims to identify the attributes of the I-CPS market structure from two perspectives: concentration and turnover. Data from Engineering News-Record over the period 2001-2011 were collected to calculate market concentration ratios and turnover indices. The results show that I-CPS competition is characterized by atomism, considerable turbulence with a steady increase in competition intensity, and the predominant role of new entrants and exiting firms in market turnover. The combination of concentration and turnover is found to be useful for characterizing the I-CPS market structure, and thereby helps I-CPS firms formulate appropriate international competition strategies.
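For readers unfamiliar with these measures, the following is a minimal sketch of how a concentration ratio and a simple rank-based turnover measure can be computed from annual firm revenues; the firm names, figures, and the specific turnover definition are illustrative assumptions, not data or definitions from the study.

```python
# Illustrative only: concentration ratio (CR_n) and a simple rank-based
# turnover measure from two years of (hypothetical) firm revenues.

def concentration_ratio(revenues, n=4):
    """CR_n: combined market share of the n largest firms."""
    total = sum(revenues.values())
    top_n = sorted(revenues.values(), reverse=True)[:n]
    return sum(top_n) / total

def rank_turnover(prev_revenues, curr_revenues, top=10):
    """Share of this year's top-`top` firms that were not in last year's top-`top`."""
    prev_top = set(sorted(prev_revenues, key=prev_revenues.get, reverse=True)[:top])
    curr_top = set(sorted(curr_revenues, key=curr_revenues.get, reverse=True)[:top])
    return len(curr_top - prev_top) / top

rev_2010 = {"FirmA": 900, "FirmB": 700, "FirmC": 650, "FirmD": 400, "FirmE": 300}
rev_2011 = {"FirmA": 950, "FirmC": 720, "FirmF": 500, "FirmB": 480, "FirmE": 310}

print(f"CR4 (2011): {concentration_ratio(rev_2011, n=4):.2f}")
print(f"Top-5 turnover 2010->2011: {rank_turnover(rev_2010, rev_2011, top=5):.2f}")
```

A low CR_n together with a high turnover share is the kind of combination the abstract describes as atomistic and turbulent competition.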
Abstract:
This is an editorial introduction to a virtual edition of the journal Critical Studies in Education focused on neoliberalism in educational sectors. The introduction outlines the nature and progress of neoliberalism, then reviews the selected articles from the journal's archives.
Abstract:
Purpose – The purpose of this paper is to investigate the extent to which directors breach the reporting requirements of the Australian Stock Exchange (ASX) and the Corporations Act in Australia. Further, it seeks to assess whether directors in Australia achieve abnormal returns from trades in their own companies. Design/methodology/approach – Using an event study approach on an Australian sample, abnormal returns for a range of situations were estimated. Findings – A total of 13 (seven) per cent of own-company directors' trades do not meet the ASX (Corporations Act) requirement of reporting within five (14) business days. Directors do achieve abnormal returns through trading in shares of their own companies. Ignoring transaction costs, outsiders can achieve abnormal returns by imitating directors' trades. Analysis of returns to directors after they trade but before they announce the trade to the market shows that directors are making small but statistically significant returns that are not available to the market. Analysis of returns to directors subsequent to the ASX reporting requirement up to the day the trade is reported shows that directors are making small but statistically significant returns that should be available to the market. Research limitations/implications – Future research should investigate the linkages between late reporting by directors and disadvantages to outside shareholders, and the internal policies implemented to mitigate insider trading. Practical implications – Market participants should remain vigilant regarding the potential for late/non-reporting of directors' trades. Originality/value – Uncovering breaches of reporting regulations is particularly important given that directors tend to purchase (sell) shares when the price is low (high), thereby achieving abnormal returns.
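The event-study approach mentioned above typically measures abnormal returns against a market-model benchmark. A minimal sketch of that calculation follows; the simulated return series, window lengths, and parameters are illustrative assumptions, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative daily returns: 250-day estimation window + 10-day event window.
market_ret = rng.normal(0.0003, 0.01, 260)
stock_ret = 0.0002 + 1.1 * market_ret + rng.normal(0, 0.012, 260)

est_mkt, est_stock = market_ret[:250], stock_ret[:250]
evt_mkt, evt_stock = market_ret[250:], stock_ret[250:]

# Market model: R_it = alpha + beta * R_mt + e_it, estimated by OLS.
beta, alpha = np.polyfit(est_mkt, est_stock, 1)

# Abnormal return = actual return minus market-model expected return.
abnormal = evt_stock - (alpha + beta * evt_mkt)
car = abnormal.cumsum()  # cumulative abnormal return over the event window

print("alpha=%.5f  beta=%.3f  CAR=%.4f" % (alpha, beta, car[-1]))
```

In a study like this one, the event window would be anchored on the trade date or the reporting date, and the significance of the cumulative abnormal return would then be tested across many director trades.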
Abstract:
Unlike US and Continental European jurisdictions, Australian monetary policy announcements are not followed promptly by projection materials or comprehensive summaries that explain the decision process. This information is disclosed two weeks later, when the explanatory minutes of the Reserve Bank board meeting are released. This paper is the first study to exploit these features of the Australian monetary policy environment to examine the differential impact of monetary policy announcements and explanatory statements on the Australian interest rate futures market. We find that both monetary policy announcements and explanatory minutes releases have a significant impact on the implied yield and volatility of Australian interest rate futures contracts. When the differential impact of these announcements is examined using the full sample, no statistically significant difference is found. However, when the sample is partitioned into stable periods and the Global Financial Crisis, a differential impact is evident. Further, contrary to the findings of Kim and Nguyen (2008), Lu et al. (2009), and Smales (2012a), the response along the yield curve is found not to differ between the short and medium terms.
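As a rough illustration of the kind of measurement involved, the sketch below converts futures prices into implied yields (Australian interest rate futures are quoted as 100 minus the yield) and computes the announcement-window yield change plus a simple volatility proxy; the price path and window are hypothetical.

```python
import numpy as np

# Hypothetical intraday price path around a policy announcement.
prices = np.array([95.720, 95.720, 95.715, 95.640, 95.655, 95.650, 95.645])
implied_yield = 100.0 - prices  # per cent per annum

# Announcement effect: change in implied yield from just before to just after.
pre, post = implied_yield[2], implied_yield[3]
yield_change_bp = (post - pre) * 100  # basis points

# Simple volatility proxy: standard deviation of yield changes in the window.
vol_bp = np.std(np.diff(implied_yield)) * 100

print(f"yield change: {yield_change_bp:.1f} bp, volatility proxy: {vol_bp:.2f} bp")
```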
Abstract:
One of the most discussed topics in labour and demographic studies, population ageing and stability, is closely related to fertility choices. This thesis explores recent developments in the fertility literature in the context of Australia. We investigate individual preferences for childbearing, the determinants of fertility decisions, and the effectiveness of government policies aimed at improving total fertility. The first study highlights the impact of monetary incentives on the decision to bear children in light of potentially differential responses across the native and immigrant populations. The second study analyses the role of unemployment and job stability in the fertility choices of women. The final study examines whether the quality-quantity trade-off exists for Australian families and explores the impact of siblings on a child's health and educational outcomes.
Abstract:
This paper examines the dispute between the Seattle company Virtual Countries Inc. and the Republic of South Africa over the ownership of the domain name address southafrica.com. The first part of the paper deals with the pre-emptive litigation taken by Virtual Countries Inc. in a District Court of the United States. The second part considers the possible arbitration of the dispute under the Uniform Domain Name Dispute Resolution Process of the Internet Corporation for Assigned Names and Numbers (ICANN) and examines the wider implications of this dispute for the jurisdiction and the governance of ICANN. The final section of the paper evaluates the Final Report of the Second WIPO Internet Domain Name Process.
Abstract:
In awarding the tender for APAM for 2014-2018 to Brisbane Powerhouse, the Australia Council required that a formal evaluation of the three iterations of APAM be undertaken by the Queensland University of Technology, Creative Industries Faculty, under the leadership of Associate Professor Sandra Gattenhof. The agreed research model delivers reporting on outcomes not only in the years in which APAM is delivered (2014, 2016, 2018) but also in the years between (2015, 2017). This inter-year report focuses on the domestic and international touring outcomes resulting from engagement in the 2014 Market and responds to two of the three key research foci for the evaluation articulated in the Brisbane Powerhouse Tender (2011) document:
• Evaluation of international market development outcomes through showcasing work to targeted international presenters and agents
• Evaluation of national market development outcomes through showcasing work to national presenters and producers.
The reporting for mid-year 2015, a non-APAM year, collects data from two key sources: six identified case study productions that have been tracked for eighteen months, and an online survey delivered to all APAM 2014 delegates. This inter-year report is a six-month follow-up with delegates and the identified case study companies, tracking the ongoing progress of market outcomes and the levers for ongoing improvement of the APAM delivery model tabled in the Year One Report delivered to Brisbane Powerhouse in October 2014.
Abstract:
The increase in data center-dependent services has made energy optimization of data centers one of the most pressing challenges of today's Information Age. Green and energy-efficient measures are essential for reducing carbon footprints and exorbitant energy costs. However, inefficient application management of data centers results in high energy consumption and low resource utilization efficiency. Unfortunately, in most cases, deploying an energy-efficient application management solution inevitably degrades the resource utilization efficiency of the data centers. To address this problem, a Penalty-based Genetic Algorithm (GA) is presented in this paper to solve a defined profile-based application assignment problem whilst maintaining a trade-off between power consumption performance and resource utilization performance. Case studies show that the penalty-based GA is highly scalable and provides 16% to 32% better solutions than a greedy algorithm.
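The paper's algorithm is not reproduced here, but a minimal sketch of the general penalty-based idea is shown below: a fitness function that trades off power against utilization and penalizes assignments that violate server capacity profiles. The encoding, weights, capacity figures, and the random-search stand-in for the GA loop are assumptions for illustration only.

```python
import random

SERVERS = [{"cpu": 16, "mem": 64}, {"cpu": 16, "mem": 64}, {"cpu": 32, "mem": 128}]
APPS = [{"cpu": 4, "mem": 8}, {"cpu": 8, "mem": 32}, {"cpu": 2, "mem": 4},
        {"cpu": 6, "mem": 16}, {"cpu": 10, "mem": 48}]
IDLE_POWER, PEAK_POWER = 100.0, 250.0
PENALTY = 1_000.0  # large constant so infeasible assignments score poorly

def fitness(assignment):
    """Lower is better: power cost + low-utilization cost + capacity-violation penalty."""
    cost = 0.0
    for s_idx, server in enumerate(SERVERS):
        apps = [APPS[a] for a, s in enumerate(assignment) if s == s_idx]
        cpu = sum(a["cpu"] for a in apps)
        mem = sum(a["mem"] for a in apps)
        util = min(1.0, cpu / server["cpu"])
        if apps:
            cost += IDLE_POWER + (PEAK_POWER - IDLE_POWER) * util  # power term
            cost += (1.0 - util) * 50.0                            # low-utilization term
        # Penalty term: CPU or memory demand exceeding the server's profile.
        cost += PENALTY * (max(0.0, cpu - server["cpu"]) + max(0.0, mem - server["mem"]))
    return cost

# Tiny random-search stand-in for the GA's selection/crossover/mutation loop.
best = min((tuple(random.randrange(len(SERVERS)) for _ in APPS) for _ in range(2000)),
           key=fitness)
print(best, fitness(best))
```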
Abstract:
Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators.
Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment; patient setup with lasers; and image-guided radiotherapy software.
Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience.
Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time saving, embedding of a case-study-based approach, increased student confidence, and optimal use of the clinical environment. Ongoing work seeks to determine the impact of simulation on clinical skills.
Abstract:
In the past few years, the virtual machine (VM) placement problem has been studied intensively and many algorithms for it have been proposed. However, these algorithms have not been widely used in today's cloud data centers because they do not consider the migration cost from the current VM placement to the new optimal VM placement. As a result, the gain from optimizing VM placement may be less than the cost of migrating from the current placement to the new one. To address this issue, this paper presents a penalty-based genetic algorithm (GA) for the VM placement problem that considers the migration cost in addition to the energy consumption of the new VM placement and the total inter-VM traffic flow in the new placement. The GA has been implemented and evaluated by experiments, and the experimental results show that it outperforms two well-known algorithms for the VM placement problem.
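A minimal sketch of the kind of composite objective described above (energy of the new placement, inter-VM traffic cost, and a penalty for migrations away from the current placement) is given below; the weights, traffic matrix, and host power model are illustrative assumptions, not the paper's formulation.

```python
# Illustrative composite cost for a candidate VM placement.
# placement[i] = host index of VM i; current[i] = host it runs on now.

TRAFFIC = {(0, 1): 5.0, (1, 2): 2.0, (0, 3): 1.0}   # assumed inter-VM traffic rates
HOST_IDLE, HOST_PEAK, HOST_CAP = 120.0, 300.0, 8     # assumed host power model / capacity
W_ENERGY, W_TRAFFIC, W_MIGRATION = 1.0, 2.0, 10.0    # assumed weights

def placement_cost(placement, current, vm_load):
    hosts = set(placement)
    # Energy: idle power for each active host plus load-proportional dynamic power.
    energy = sum(HOST_IDLE + (HOST_PEAK - HOST_IDLE) *
                 min(1.0, sum(l for v, l in enumerate(vm_load) if placement[v] == h) / HOST_CAP)
                 for h in hosts)
    # Traffic: only VM pairs placed on different hosts generate network cost.
    traffic = sum(rate for (a, b), rate in TRAFFIC.items() if placement[a] != placement[b])
    # Migration penalty: each VM that must move from its current host adds cost.
    migration = sum(1 for v, h in enumerate(placement) if h != current[v])
    return W_ENERGY * energy + W_TRAFFIC * traffic + W_MIGRATION * migration

current = [0, 0, 1, 1]
candidate = [0, 0, 0, 1]
print(placement_cost(candidate, current, vm_load=[2, 3, 1, 2]))
```

A GA of the kind described would use a cost like this as its fitness, so that a placement saving little energy but requiring many migrations scores worse than a slightly less efficient placement that leaves most VMs where they are.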
Abstract:
Although live VM migration has been intensively studied, the problem of live migration of multiple interdependent VMs has hardly been investigated. The most important problem in the live migration of multiple interdependent VMs is how to schedule the migrations, as the schedule directly affects the total migration time and the total downtime of those VMs. Aiming at minimizing both the total migration time and the total downtime simultaneously, this paper presents a Strength Pareto Evolutionary Algorithm 2 (SPEA2) for the multi-VM migration scheduling problem. The SPEA2 has been evaluated by experiments, and the experimental results show that it can generate a set of VM migration schedules with a shorter total migration time and a shorter total downtime than an existing genetic algorithm, namely the Random Key Genetic Algorithm (RKGA). This paper also studies the scalability of the SPEA2.
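SPEA2 itself is too long to reproduce here, but the sketch below illustrates the core ingredients it operates on in this setting: evaluating a migration schedule on the two objectives (total migration time and total downtime) and filtering candidate schedules down to their non-dominated set. The VM sizes, dependency groups, bandwidth, downtime model, and the exhaustive enumeration used in place of the evolutionary search are all assumptions for illustration.

```python
from itertools import permutations

# Assumed per-VM memory to copy (GB) and groups of interdependent VMs.
MEM_GB = [4, 8, 2, 16, 6]
GROUPS = [{0, 1}, {2, 3, 4}]   # VMs in the same group serve one application
LINK_GBPS = 1.0                # assumed bandwidth; migrations run one at a time

def evaluate(order):
    """Return (total migration time, total downtime) for a sequential schedule.

    Total migration time sums each VM's completion time (waiting plus copy time);
    a group's downtime runs from the start of its first member's migration to the
    end of its last member's migration.
    """
    start, finish, t = {}, {}, 0.0
    for v in order:
        start[v] = t
        t += MEM_GB[v] * 8 / LINK_GBPS  # GB -> Gb, divided by Gbps, gives seconds
        finish[v] = t
    total_migration = sum(finish.values())
    total_downtime = sum(max(finish[v] for v in g) - min(start[v] for v in g)
                         for g in GROUPS)
    return total_migration, total_downtime

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Exhaustive Pareto front for this toy instance (SPEA2 would search a larger space).
points = {order: evaluate(order) for order in permutations(range(len(MEM_GB)))}
front = [(o, f) for o, f in points.items()
         if not any(dominates(g, f) for g in points.values() if g != f)]
for order, (mig, down) in sorted(front, key=lambda x: x[1]):
    print(order, f"migration={mig:.0f}s downtime={down:.0f}s")
```

The trade-off arises because ordering short migrations first reduces total migration time, while keeping interdependent VMs adjacent in the schedule reduces total downtime; a multi-objective algorithm returns the set of schedules that balance the two.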
Abstract:
By the time students reach the middle years they have experienced many chance activities based on dice. Common among these are rolling one die to explore the relationship of frequency and theoretical probability, and rolling two dice and summing the outcomes to consider their probabilities. Although dice may be considered overused by some, the advantage they offer is a familiar context within which to explore much more complex concepts. If the basic chance mechanism of the device is understood, it is possible to enter quickly into an arena of more complex concepts. This is what happened with a two hour activity engaged in by four classes of Grade 6 students in the same school. The activity targeted the concepts of variation and expectation. The teachers held extended discussions with their classes on variation and expectation at the beginning of the activity, with students contributing examples of the two concepts from their own experience. These notions are quite sophisticated for Grade 6, but the underlying concepts describe phenomena that students encounter every day. For example, time varies continuously; sporting results vary from game to game; the maximum temperature varies from day to day. However, there is an expectation about tomorrow’s maximum temperature based on the expert advice from the weather bureau. There may also be an expectation about a sporting result based on the participants’ previous results. It is this juxtaposition that makes life interesting. Variation hence describes the differences we see in phenomena around us. In a scenario displaying variation, expectation describes the effort to characterise or summarise the variation and perhaps make a prediction about the message arising from the scenario. The explicit purpose of the activity described here was to use the familiar scenario of rolling a die to expose these two concepts. Because the students had previously experienced rolling physical dice they knew instinctively about the variation that occurs across many rolls and about the theoretical expectation that each side should “come up” one-sixth of the time. They had observed the instances of the concepts in action, but had not consolidated the underlying terminology to describe it. As the two concepts are so fundamental to understanding statistics, we felt it would be useful to begin building in the familiar environment of rolling a die. Because hand-held dice limit the explorations students can undertake, the classes used the soft-ware TinkerPlots (Konold & Miller, 2011) to simulate rolling a die multiple times.
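A minimal sketch of the same kind of simulation in Python rather than TinkerPlots, showing the variation in observed face frequencies across different numbers of rolls against the expectation of one-sixth per face; the sample sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Repeat the "roll a die n times" experiment and compare the observed relative
# frequency of each face (variation) with the theoretical 1/6 (expectation).
for n in (30, 600, 6000):
    rolls = rng.integers(1, 7, size=n)
    freqs = np.bincount(rolls, minlength=7)[1:] / n
    print(f"n={n:5d}  observed: {np.round(freqs, 3)}  expected: {1/6:.3f}")
```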
Abstract:
In fisheries managed using individual transferable quotas (ITQs) it is generally assumed that quota markets are well-functioning, allowing quota to flow on either a temporary or permanent basis to those able to make best use of it. However, despite an increasing number of fisheries being managed under ITQs, empirical assessments of the quota markets that have actually evolved in these fisheries remain scarce. The Queensland Coral Reef Fin-Fish Fishery (CRFFF) on the Great Barrier Reef has been managed under a system of ITQs since 2004. Data on individual quota holdings and trades for the period 2004-2012 were used to assess the CRFFF quota market and its evolution through time. Network analysis was applied to assess market structure and the nature of lease-trading relationships. An assessment of market participants’ abilities to balance their quota accounts, i.e., gap analysis, provided insights into market functionality and how this may have changed in the period observed. Trends in ownership and trade were determined, and market participants were identified as belonging to one of seven generalized types. The emergence of groups such as investors and lease-dependent fishers is clear. In 2011-2012, 41% of coral trout quota was owned by participants that did not fish it, and 64% of total coral trout landings were made by fishers that owned only 10% of the quota. Quota brokers emerged whose influence on the market varied with the bioeconomic conditions of the fishery. Throughout the study period some quota was found to remain inactive, implying potential market inefficiencies. Contribution to this inactivity appeared asymmetrical, with most residing in the hands of smaller quota holders. The importance of transaction costs in the operation of the quota market and the inequalities that may result are discussed in light of these findings.
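A minimal sketch, using networkx, of the two analyses named above: building a directed lease-trade network between quota holders and computing each participant's quota gap (here taken as holdings plus net leased-in quota minus landings). The participants, quantities, and the exact gap definition are illustrative assumptions, not data from the study.

```python
import networkx as nx

# Hypothetical lease trades: (lessor, lessee, tonnes of quota leased).
trades = [("Investor1", "Fisher1", 20), ("Investor1", "Fisher2", 15),
          ("Broker1", "Fisher1", 10), ("Fisher3", "Broker1", 5)]
holdings = {"Investor1": 40, "Broker1": 8, "Fisher1": 2, "Fisher2": 0, "Fisher3": 12}
landings = {"Investor1": 0, "Broker1": 0, "Fisher1": 31, "Fisher2": 14, "Fisher3": 6}

G = nx.DiGraph()
for lessor, lessee, tonnes in trades:
    G.add_edge(lessor, lessee, weight=tonnes)

# Simple structural measures of the lease-trading network.
print("out-degree (lessors):", dict(G.out_degree()))
print("in-degree (lessees): ", dict(G.in_degree()))

# Gap analysis: unused quota (positive) or uncovered catch (negative) per participant.
for p in holdings:
    leased_in = sum(d["weight"] for _, _, d in G.in_edges(p, data=True))
    leased_out = sum(d["weight"] for _, _, d in G.out_edges(p, data=True))
    gap = holdings[p] + leased_in - leased_out - landings.get(p, 0)
    print(f"{p}: gap = {gap:+d} t")
```

Summing positive gaps across participants gives one simple view of the inactive quota the abstract refers to, and edge weights and degrees identify broker-like nodes that intermediate many leases.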
Abstract:
Network topology and routing are two important factors in determining the communication costs of big data applications at large scale. For a given Cluster, Cloud, or Grid (CCG) system, the network topology is fixed, and static or dynamic routing protocols are preinstalled to direct the network traffic; users cannot change them once the system is deployed. Hence, it is hard for application developers to identify the optimal network topology and routing algorithm for their applications with distinct communication patterns. In this study, we design a CCG virtual system (CCGVS), which first uses container-based virtualization to allow users to create a farm of lightweight virtual machines on a single host. Then, it uses the software-defined networking (SDN) technique to control the network traffic among these virtual machines. Users can change the network topology and control the network traffic programmatically, thereby enabling application developers to evaluate their applications on the same system with different network topologies and routing algorithms. The preliminary experimental results, obtained with both synthetic big data programs and the NPB benchmarks, show that CCGVS can represent application performance variations caused by network topology and routing algorithms.
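CCGVS itself is not reproduced here; as a rough stand-in, the sketch below uses Mininet (not mentioned in the abstract, assumed here purely for illustration), which similarly combines lightweight process-based hosts with SDN-controlled switches, to run the same workload over two different emulated topologies; the host counts and topologies are arbitrary.

```python
# Illustrative only: Mininet as a stand-in for a container-plus-SDN testbed
# like CCGVS. Requires root privileges and a standard Mininet/Open vSwitch install.
from mininet.net import Mininet
from mininet.topo import Topo

class StarTopo(Topo):
    """All hosts hang off a single switch."""
    def build(self, n=4):
        s1 = self.addSwitch("s1")
        for i in range(1, n + 1):
            self.addLink(self.addHost(f"h{i}"), s1)

class LineTopo(Topo):
    """Hosts attached to a chain of switches."""
    def build(self, n=4):
        switches = [self.addSwitch(f"s{i}") for i in range(1, n + 1)]
        for a, b in zip(switches, switches[1:]):
            self.addLink(a, b)
        for i, sw in enumerate(switches, start=1):
            self.addLink(self.addHost(f"h{i}"), sw)

for topo in (StarTopo(n=4), LineTopo(n=4)):
    net = Mininet(topo=topo)
    net.start()
    # Same "application" (here just all-pairs ping) on different topologies.
    loss = net.pingAll()
    print(type(topo).__name__, "packet loss %:", loss)
    net.stop()
```

The point of a system like CCGVS is that the big data application and the measurement harness stay fixed while the emulated topology and the SDN routing rules are swapped programmatically, so observed performance differences can be attributed to the network design.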