825 results for Leadership, leader, power, company, communication, competitiveness.
Abstract:
The competitiveness of the construction industry is an important issue for many countries as the industry makes up a substantial part of their GDP – about 8% in the UK. A number of competitiveness studies have been undertaken at company, industry and national levels. However, there has been little focus on sustainable competitiveness and the many factors that are involved. This paper addresses that need by investigating what construction industry experts consider to be the most important factors of construction industry competitiveness. It does so by conducting a Delphi survey among industry experts in Finland, Sweden and the UK. A list of 158 factors was compiled from competitiveness reports by institutions such as the World Economic Forum and the International Institute for Management Development, as well as from explorative workshops in the countries involved in the study. For each of the countries, experts with different perspectives on the industry, including consultants, contractors and clients, were asked to select their 30 most influential factors. They then ranked their chosen factors in order of importance for the competitiveness of their construction industry. The findings after the first round of the Delphi process underline the complexity of the term competitiveness and the wide range of factors that are considered important contributors to competitiveness. The results also indicate that the factors considered most important for competitiveness are likely to differ from one country to another.
Abstract:
Ever since the invention of writing, people have used text to store and distribute their thoughts. With the advent of computers and the Internet, the delivery of these messages has become almost instant. Textual conversations can now be had regardless of location or distance. Advances in computational power for 3D graphics are enabling Virtual Environments (VE) within which users can become increasingly immersed. By opening these environments to other users, initially by sharing these text conversation channels, we aim to extend the immersed experience into an online virtual community. This paper examines work that brings textual communications into the VE, enabling interaction between the real and virtual worlds.
Abstract:
In this paper we introduce a new Wiener system modeling approach for high power amplifiers with memory in communication systems, using observational input/output data. By assuming that the nonlinearity in the Wiener model is mainly dependent on the input signal amplitude, the complex-valued nonlinear static function is represented by two real-valued B-spline curves, one for the amplitude distortion and one for the phase shift. The Gauss-Newton algorithm is applied for the parameter estimation, incorporating the De Boor algorithm for both the B-spline curve evaluation and the first-order derivative recursions. An illustrative example is utilized to demonstrate the efficacy of the proposed approach.
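To make the B-spline representation concrete, below is a minimal sketch of a complex-valued static nonlinearity built from two real-valued B-spline curves evaluated with De Boor's recursion, as the abstract describes. The function names, knot vector and control points are illustrative assumptions, not the paper's estimated values (in the paper these would come from the Gauss-Newton fit).

```python
import cmath

def de_boor(x, knots, ctrl, p):
    """Evaluate a degree-p B-spline with the given knot vector and
    control points at x, using De Boor's recursion."""
    # Locate the knot span k with knots[k] <= x < knots[k+1],
    # clamped to the last valid span so the right endpoint works.
    k = p
    while k < len(knots) - p - 2 and knots[k + 1] <= x:
        k += 1
    d = [ctrl[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - knots[j + k - p]) / (knots[j + 1 + k - r] - knots[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

def wiener_nonlinearity(x, knots, c_amp, c_phase, p=1):
    """Complex gain of the Wiener model's static block: one B-spline
    for the amplitude distortion (AM/AM) and one for the phase
    shift (AM/PM), both functions of the input amplitude |x|."""
    r, phi = abs(x), cmath.phase(x)
    return de_boor(r, knots, c_amp, p) * cmath.exp(1j * (phi + de_boor(r, knots, c_phase, p)))
```

In the paper's setting the control points of both curves would be the parameters estimated by the Gauss-Newton iteration; here fixed values merely demonstrate the evaluation path.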
Abstract:
With the advance of information technology capabilities and the growing importance of human-computer interfaces within society, there has been a significant increase in research activity within the field of human-computer interaction (HCI). This paper summarizes some of the work undertaken to date, paying particular attention to methods applicable to on-line control and monitoring systems such as those employed by The National Grid Company plc.
Abstract:
A new generation of advanced surveillance systems is being conceived as a collection of multi-sensor components such as video, audio and mobile robots interacting in a cooperative manner to enhance situation awareness capabilities and assist surveillance personnel. The prominent issues that these systems face are: the improvement of existing intelligent video surveillance systems, the inclusion of wireless networks, the use of low power sensors, the design architecture, the communication between different components, the fusion of data emerging from different types of sensors, the location of personnel (providers and consumers) and the scalability of the system. This paper focuses on the aspects pertaining to real-time distributed architecture and scalability. For example, to meet real-time requirements, these systems need to process data streams in concurrent environments, designed by taking into account scheduling and synchronisation. The paper proposes a framework for the design of visual surveillance systems based on components derived from the principles of Real Time Networks/Data Oriented Requirements Implementation Scheme (RTN/DORIS). It also proposes the implementation of these components using the well-known middleware technology Common Object Request Broker Architecture (CORBA). Results using this architecture for video surveillance are presented through an implemented prototype.
Abstract:
Linear models of market performance may be misspecified if the market is subdivided into distinct regimes exhibiting different behaviour. Price movements in the US Real Estate Investment Trusts and UK Property Companies Markets are explored using a Threshold Autoregressive (TAR) model with regimes defined by the real rate of interest. In both US and UK markets, distinctive behaviour emerges, with the TAR model offering better predictive power than a more conventional linear autoregressive model. The research points to the possibility of developing trading rules to exploit the systematically different behaviour across regimes.
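The regime-switching idea in the abstract above can be sketched very simply: a two-regime TAR(1) forecast in which the threshold variable (here, the real rate of interest) selects which autoregressive equation applies. The intercepts, AR coefficients and threshold below are illustrative assumptions, not the estimates from the paper.

```python
def tar_step(y_prev, z, threshold, low=(0.2, 0.8), high=(-0.1, 0.3)):
    """One-step forecast of a two-regime TAR(1) model.

    The threshold variable z selects the regime; each regime has its
    own (intercept, AR coefficient) pair. All numbers are hypothetical,
    chosen only to show the mechanics of regime switching.
    """
    c, phi = low if z <= threshold else high
    return c + phi * y_prev
```

A conventional linear AR model corresponds to forcing both regimes to share one coefficient pair; the TAR model's extra predictive power in the paper comes precisely from letting the two regimes differ.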
Abstract:
Conventional economic theory, applied to information released by listed companies, equates ‘useful’ with ‘price-sensitive’. Stock exchange rules accordingly prohibit the selective, private communication of price-sensitive information. Yet, even in the absence of such communication, UK equity fund managers routinely meet privately with the senior executives of the companies in which they invest. Moreover, they consider these brief, formal and formulaic meetings to be their most important sources of investment information. In this paper we ask how that can be. Drawing on interview and observation data with fund managers and CFOs, we find evidence for three, non-mutually exclusive explanations: that the characterisation of information in conventional economic theory is too restricted, that fund managers fail to act with the rationality that conventional economic theory assumes, and/or that the primary value of the meetings for fund managers is not related to their investment decision making but to the claims of superior knowledge made to clients in marketing their active fund management expertise. Our findings suggest a disconnect between economic theory and economic policy based on that theory, as well as a corresponding limitation in research studies that test information-usefulness by assuming it to be synonymous with price-sensitivity. We draw implications for further research into the role of tacit knowledge in equity investment decision-making, and also into the effects of the principal–agent relationship between fund managers and their clients.
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm where the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
Abstract:
Sri Lanka's participation rates in higher education are low and have risen only slightly in the last few decades; the number of places for higher education in the state university system only caters for around 3% of the university entrant age cohort. The literature reveals that the highly competitive global knowledge economy increasingly favours workers with high levels of education who are also lifelong learners. This lack of access to higher education for a sizable proportion of the labour force is identified as a severe impediment to Sri Lanka's competitiveness in the global knowledge economy. The literature also suggests that Information and Communication Technologies are increasingly relied upon in many contexts in order to deliver flexible learning, to cater especially for the needs of lifelong learners in today's higher educational landscape. The government of Sri Lanka invested heavily in ICTs for distance education during the period 2003-2009 in a bid to increase access to higher education; but there has been little research into the impact of this. To address this lack, this study investigated the impact of ICTs on distance education in Sri Lanka with respect to increasing access to higher education. In order to achieve this aim, the research focused on Sri Lanka's effort from three perspectives: policy perspective, implementation perspective and user perspective. A multiple case study research using an ethnographic approach was conducted to observe Orange Valley University's and Yellow Fields University's (pseudonymous) implementation of distance education programmes using questionnaires, qualitative interviewing and document analysis. In total, data for the analysis was collected from 129 questionnaires, 33 individual interviews and 2 group interviews. The research revealed that ICTs have indeed increased opportunities for higher education; but mainly for people of affluent families from the Western Province.
Issues identified were categorized under the themes: quality assurance, location, language, digital literacies and access to resources. Recommendations were offered to tackle the identified issues in accordance with the study findings. The study also revealed the strong presence of a multifaceted digital divide in the country. In conclusion, this research has shown that, although ICT-enabled distance education has the potential to increase access to higher education, the present implementation of the system in Sri Lanka has been less than successful.
Abstract:
The Pax Americana and the grand strategy of hegemony (or “Primacy”) that underpins it may be becoming unsustainable. Particularly in the wake of exhausting wars, the Global Financial Crisis, and the shift of wealth from West to East, it may no longer be possible or prudent for the United States to act as the unipolar sheriff or guardian of a world order. But how viable are the alternatives, and what difficulties will these alternatives entail in their design and execution? This analysis offers a sympathetic but critical analysis of alternative U.S. National Security Strategies of “retrenchment” that critics of American diplomacy offer. In these strategies, the United States would anticipate the coming of a more multipolar world and organize its behavior around the dual principles of “concert” and “balance,” seeking a collaborative relationship with other great powers, while being prepared to counterbalance any hostile aggressor that threatens world order. The proponents of such strategies argue that by scaling back its global military presence and its commitments, the United States can trade prestige for security, shift burdens, and attain a more free hand. To support this theory, they often look to the 19th-century concert of Europe as a model of a successful security regime and to general theories about the natural balancing behavior of states. This monograph examines this precedent and measures its usefulness for contemporary statecraft to identify how great power concerts are sustained and how they break down. The project also applies competing theories to how states might behave if world politics are in transition: Will they balance, bandwagon, or hedge? This demonstrates the multiple possible futures that could shape and be shaped by a new strategy. A new strategy based on an acceptance of multipolarity and the limits of power is prudent. There is scope for such a shift.
The convergence of several trends—including transnational problems needing collaborative efforts, the military advantages of defenders, the reluctance of states to engage in unbridled competition, and hegemony fatigue among the American people—means that an opportunity exists internationally and at home for a shift to a new strategy. But a Concert-Balance strategy will still need to deal with several potential dilemmas. These include the difficulty of reconciling competitive balancing with cooperative concerts, the limits of balancing without a forward-reaching onshore military capability, possible unanticipated consequences such as a rise in regional power competition or the emergence of blocs (such as a Chinese East Asia or an Iranian Gulf), and the challenge of sustaining domestic political support for a strategy that voluntarily abdicates world leadership. These difficulties can be mitigated, but they must be met with pragmatic and gradual implementation as well as elegant theorizing, and with care to avoid swapping one ironclad, doctrinaire grand strategy for another.
Abstract:
We apply an alternating proposals protocol with a confirmation stage as a way of solving a Prisoner’s Dilemma game. We interpret players’ proposals and (no) confirmation of outcomes of the game as a tacit communication device. The protocol leads to unprecedentedly high levels of cooperation in the laboratory. Assigning the power of confirmation to one of the two players alone, rather than alternating the role of leader, significantly increases the probability of signing a cooperative agreement in the first bargaining period. We interpret pre-agreement strategies as tacit messages on players’ willingness to cooperate and on their beliefs about the others’ type.
Abstract:
We suggest an alternating proposals protocol with a confirmation stage as a way of solving a Prisoner's Dilemma game. We interpret players' proposals and (no) confirmation of outcomes of the game as a tacit communication device. The protocol leads to unprecedentedly high levels of cooperation in the laboratory. Assigning the power of confirmation to one of the two players alone, rather than alternating the role of leader, significantly increases the probability of cooperation in the first bargaining period. We interpret pre-agreement strategies as tacit messages on players' willingness to cooperate and as signals pursuing individualistic objectives like publicizing one's bargaining abilities or eliciting those of the opponent.
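To see why a confirmation device matters in these two abstracts' setting, a worked illustration of the underlying dilemma helps: with the usual textbook Prisoner's Dilemma payoffs (hypothetical numbers, not those used in the experiments), defection is each player's best reply to anything, yet mutual cooperation pays both players more than mutual defection, which is exactly the gap a tacit communication protocol tries to bridge.

```python
# Standard Prisoner's Dilemma payoffs, (row payoff, column payoff);
# the numbers are the usual textbook values, not the experiments'.
PAYOFFS = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
           ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def best_reply(opponent_action):
    """Row player's payoff-maximising reply to a fixed opponent action."""
    return max('CD', key=lambda a: PAYOFFS[(a, opponent_action)][0])
```

Because `best_reply` is 'D' against both 'C' and 'D', one-shot play converges on the inefficient (1, 1) outcome; the alternating-proposals-plus-confirmation protocol is a device for reaching the (3, 3) agreement instead.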
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm where the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs.
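The per-iteration global reduction that this abstract identifies as the scalability bottleneck can be sketched as follows: each node makes a local pass over its own points, and then an O(k·d) all-reduce combines every node's per-cluster sums and counts into the new centroids. The function names are illustrative, and the "nodes" are simulated here as plain Python lists rather than real distributed processes.

```python
def local_stats(points, centroids):
    """Per-node pass: assign each local point to its nearest centroid
    and accumulate per-cluster sums and counts (the data each node
    would have to communicate)."""
    k, d = len(centroids), len(centroids[0])
    sums = [[0.0] * d for _ in range(k)]
    counts = [0] * k
    for p in points:
        j = min(range(k),
                key=lambda c: sum((p[i] - centroids[c][i]) ** 2 for i in range(d)))
        counts[j] += 1
        for i in range(d):
            sums[j][i] += p[i]
    return sums, counts

def global_reduction(per_node):
    """The O(k*d) global reduction needed at every iteration: merge all
    nodes' sums/counts and recompute the centroids."""
    k, d = len(per_node[0][0]), len(per_node[0][0][0])
    sums = [[0.0] * d for _ in range(k)]
    counts = [0] * k
    for s, c in per_node:
        for j in range(k):
            counts[j] += c[j]
            for i in range(d):
                sums[j][i] += s[j][i]
    return [[sums[j][i] / counts[j] for i in range(d)] for j in range(k)]
```

The paper's contribution is avoiding `global_reduction` altogether: with a non-uniform data distribution (e.g. induced by multi-dimensional binary search trees), each centroid's update can be localised to the nodes that actually hold its points, while still reproducing the centralised algorithm's exact result.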