837 results for 3D multi-user virtual environments
Abstract:
Global challenges, complexity and continuous uncertainty demand the development of leadership approaches, employees and multi-organisation constellations. Current leadership theories do not sufficiently address the needs of complex business environments. First of all, before successful leadership models can be applied in practice, leadership needs to shift from the industrial age to the knowledge era. Many leadership models still view leadership solely through the perspective of linear process thinking. In addition, there is not enough knowledge or experience in applying these newer models in practice. Leadership theories continue to be based on the assumption that leaders possess, or have access to, all the relevant knowledge and capabilities needed to decide future directions without external advice. In many companies, however, the workforce consists of skilled professionals whose work and related interfaces are so challenging that leaders cannot grasp all the linked viewpoints and cross-impacts alone. One of the main objectives of this study is to understand how to support participants in organisations, and their stakeholders, in confronting various environments through practice-based innovation processes. Another aim is to find effective ways of recognising and reacting to diverse contexts, so that companies and other stakeholders are better able to link to knowledge flows and shared value creation processes in advancing joint value to their customers. The main research question of this dissertation, then, concerns how to enhance leadership in complex environments. The dissertation can, on the whole, be characterised as a qualitative multiple-case study. The research questions and objectives were investigated through six studies published in international scientific journals. The main methods applied were interviews, action research and a survey.
The empirical focus was on Finnish companies, and the research questions were examined in various organisations at the top levels (leaders and managers) and bottom levels (employees), in the context of collaboration between organisations and cooperation between the case companies and their client organisations. However, the emphasis of the analysis is on the internal and external aspects of organisations as they are enacted in practice-based innovation processes. The results of this study suggest that the Cynefin framework, complexity leadership theory and transformational leadership represent theoretical models applicable to developing leadership through practice-based innovation. In and of themselves, they all support confronting contemporary challenges, but an implementable method for organisations may be constructed by assimilating them into practice-based innovation processes. Recognition of diverse environments, and of their various contexts and roles in the activities and collaboration of organisations and their interest groups, is ever more important for achieving better interaction, in which a strategic or formal status may be bypassed. In innovation processes, it is not necessarily the leader who possesses the essential knowledge; thus, it is the role of leadership to offer methods and arenas in which different actors may generate advances. Enabling and supporting continuous interaction and integrated knowledge flows is of crucial importance for achieving the emergence of innovations in the activities of organisations and in various forms of collaboration. The main contribution of this dissertation relates to applying these new conceptual models in practice. Empirical evidence on the relevance of different leadership roles in practice-based innovation processes in Finnish companies is another valuable contribution. Finally, the dissertation sheds light on the significance of combining complexity science with leadership and innovation theories in research.
Abstract:
This study discusses the procedures of value co-creation in the video gaming industry. The purpose of this study was to identify the procedures at work in the current video gaming industry, answering the main research problem of how value is co-created in the video gaming industry, followed by three sub-questions: (i) What is value co-creation in the gaming industry? (ii) Who participates in value co-creation in the gaming industry? (iii) What are the procedures involved in value co-creation in the gaming industry? The theoretical background of the study consists of literature relating to marketing theory, i.e., the notion of value, the conventional understanding of value creation, the value chain, the co-creation approach and the co-production approach. The research adopted a qualitative approach. As the relationship platform, the researcher used a Web 2.0 tool interface. Data were collected from social networks, and the netnography method was applied to analyze them. Findings show that customers and companies co-create an optimum level of value when they interact with each other, as well as when customers interact among themselves. However, it was mostly the C2C interactions, discussions and dialogue threads emerging around the main discussion that facilitated value co-creation. Accordingly, companies need to exploit and further motivate, develop and support the interactions between customers participating in value creation. A hierarchy of value co-creation processes is the result derived from the identified challenges of the value co-creation approach and from the analysis of discussion forum data. Overall, three general sets and seven topics were found that explore the phenomenon of customer-to-customer (C2C) and business-to-customer (B2C) interaction and debate for value co-creation through user-generated content. These topics describe how gamers contribute and interact in co-creating value along with companies.
A methodical survey of the current research literature identified numerous evolving streams of value research in this study: the general management perspective, new product development and innovation, the virtual customer environment, service science and service-dominant logic. Overall, the topics deliver various practical and conceptual implications for engaging and managing gamers in social networks to augment customers’ value co-creation processes.
Abstract:
One of the main challenges in Software Engineering is to cope with the transition from an industry based on software as a product to software as a service. The field of Software Engineering should provide the necessary methods and tools to develop and deploy new cost-efficient and scalable digital services. In this thesis, we focus on deployment platforms to ensure cost-efficient scalability of multi-tier web applications and of an on-demand video transcoding service under different types of load conditions. Infrastructure as a Service (IaaS) clouds provide Virtual Machines (VMs) under the pay-per-use business model. Dynamically provisioning VMs on demand allows service providers to cope with fluctuations in the number of service users. However, VM provisioning must be done carefully, because over-provisioning results in an increased operational cost, while under-provisioning leads to a subpar service. Therefore, our main focus in this thesis is on cost-efficient VM provisioning for multi-tier web applications and on-demand video transcoding. Moreover, to prevent provisioned VMs from becoming overloaded, we augment VM provisioning with an admission control mechanism. Similarly, to ensure efficient use of provisioned VMs, web applications on under-utilized VMs are consolidated periodically. Thus, the main problem that we address is cost-efficient VM provisioning augmented with server consolidation and admission control on the provisioned VMs. We seek solutions for two types of applications: multi-tier web applications that follow the request-response paradigm, and on-demand video transcoding that is based on video streams with soft real-time constraints. Our first contribution is a cost-efficient VM provisioning approach for multi-tier web applications.
The proposed approach comprises two sub-approaches: a reactive VM provisioning approach called ARVUE, and a hybrid reactive-proactive VM provisioning approach called Cost-efficient Resource Allocation for Multiple web applications with Proactive scaling. Our second contribution is a prediction-based VM provisioning approach for on-demand video transcoding in the cloud. Moreover, to prevent virtualized servers from becoming overloaded, the proposed VM provisioning approaches are augmented with admission control approaches. Therefore, our third contribution is a session-based admission control approach for multi-tier web applications called adaptive Admission Control for Virtualized Application Servers. Similarly, the fourth contribution in this thesis is a stream-based admission control and scheduling approach for on-demand video transcoding called Stream-Based Admission Control and Scheduling. Our fifth contribution is a computation and storage trade-off strategy for cost-efficient video transcoding in cloud computing. Finally, the sixth and last contribution is a web application consolidation approach, which uses Ant Colony System to minimize the under-utilization of the virtualized application servers.
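The interplay of reactive provisioning and admission control described above can be sketched as a simple threshold controller. This is an illustrative sketch only; the thresholds, step size and per-VM session capacity are invented, and this is not the thesis's ARVUE algorithm:

```python
# Hypothetical reactive autoscaler: scale the VM count up when average
# utilization exceeds an upper threshold and down when it falls below a
# lower one. All thresholds and capacities are assumptions.
def scale_decision(current_vms, avg_utilization,
                   upper=0.80, lower=0.30, min_vms=1, max_vms=20):
    """Return the VM count for the next control interval."""
    if avg_utilization > upper and current_vms < max_vms:
        return current_vms + 1      # provision one more VM
    if avg_utilization < lower and current_vms > min_vms:
        return current_vms - 1      # release an under-utilized VM
    return current_vms              # within the target band: no change

# Companion admission control: reject sessions that would overload
# the currently provisioned VMs.
def admit_session(active_sessions, capacity_per_vm, vms):
    return active_sessions < capacity_per_vm * vms
```

In practice such a controller would also need a cooldown period to avoid oscillating, which prediction-based approaches address more systematically.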
Abstract:
Digital business ecosystems (DBE) are becoming an increasingly popular concept for modelling and building distributed systems in heterogeneous, decentralized and open environments. Information and communication technology (ICT) enabled business solutions have created an opportunity for automated business relations and transactions. The deployment of ICT in business-to-business (B2B) integration seeks to improve competitiveness by establishing real-time information exchange and offering better information visibility to business ecosystem actors. The flows of products, components and raw materials in supply chains are traditionally studied in logistics research. In this study, we expand the research to cover the parallel service and information flows as information logistics integration. In this thesis, we show how better integration and automation of information flows enhance the speed of processes and, thus, provide cost savings and other benefits for organizations. Investments in DBE are intended to add value through business automation and are key decisions in building up information logistics integration. Business solutions that build on automation are important sources of value in networks that promote and support business relations and transactions. Value is created through improved productivity and effectiveness when new, more efficient collaboration methods are discovered and integrated into the DBE. Organizations, business networks and collaborations, even with competitors, form DBE in which information logistics integration has a significant role as a value driver.
However, traditional economic and computing theories do not focus on digital business ecosystems as a separate form of organization, and they do not provide conceptual frameworks that can be used to explore digital business ecosystems as value drivers; combined internal management and external coordination mechanisms for information logistics integration are not currently part of a company’s strategic process. In this thesis, we have developed and tested a framework for exploring digital business ecosystems and a coordination model for digital business ecosystem integration; moreover, we have analysed the value of information logistics integration. The research is based on a case study and on mixed methods, in which we use the Delphi method and Internet-based tools for idea generation and development. We conducted many interviews with key experts, which we recorded, transcribed and coded to find success factors. Quantitative analyses were based on a Monte Carlo simulation, which sought cost savings, and on Real Option Valuation, which sought an optimal investment program at the ecosystem level. This study provides valuable knowledge regarding information logistics integration by utilizing a suitable business process information model for collaboration. The information model is based on business process scenarios and on detailed transactions for the mapping and automation of product, service and information flows. The research results illustrate the current gap in the understanding of information logistics integration in a digital business ecosystem. Based on the success factors, we were able to illustrate how specific coordination mechanisms related to network management and orchestration could be designed. We also pointed out the potential of information logistics integration in value creation. With the help of global standardization experts, we utilized the design of the core information model for B2B integration.
We built this quantitative analysis using the Monte Carlo-based simulation model and the Real Option Value model. This research covers relevant new research disciplines, such as information logistics integration and digital business ecosystems, where the current literature needs to be improved. This research was executed by high-level experts and managers responsible for global business network B2B integration. However, the research was dominated by one industry domain, and therefore a more comprehensive exploration should be undertaken to cover a larger population of business sectors. Based on this research, a new quantitative survey could provide new possibilities to examine information logistics integration in digital business ecosystems. The value activities indicate that further studies should continue, especially with regard to the collaboration issues of integration, focusing on a user-centric approach. We should better understand how real-time information supports customer value creation by embedding the information into the lifetime value of products and services. The aim of this research was to build competitive advantage through B2B integration to support a real-time economy. For practitioners, this research created several tools and concepts to improve value activities, information logistics integration design, and management and orchestration models. Based on the results, the companies were able to better understand the formulation of the digital business ecosystem and the importance of joint efforts in collaboration. However, the challenge of incorporating this new knowledge into strategic processes in a multi-stakeholder environment remains. This challenge has been noted, and new projects have been established in pursuit of a real-time economy.
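As a hedged illustration of how a Monte Carlo simulation can estimate cost savings, the sketch below samples assumed distributions for annual transaction volume and per-transaction saving. All distributions and parameter values are invented for illustration; they are not the study's model or data:

```python
import random

# Illustrative Monte Carlo sketch (not the thesis model): sample an assumed
# annual transaction volume and an assumed per-transaction saving to
# estimate the distribution of cost savings from B2B integration.
def simulate_savings(n_runs=10_000, seed=42):
    rng = random.Random(seed)
    results = []
    for _ in range(n_runs):
        volume = max(rng.gauss(100_000, 15_000), 0.0)  # transactions/year
        saving = rng.uniform(0.5, 2.0)                 # saving per transaction
        results.append(volume * saving)
    return results

savings = simulate_savings()
mean_saving = sum(savings) / len(savings)
```

The resulting distribution, rather than a single point estimate, is what makes such simulations useful for investment decisions under uncertainty.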
Abstract:
This thesis presents a novel design paradigm, called Virtual Runtime Application Partitions (VRAP), to judiciously utilize the on-chip resources. As the dark silicon era approaches, where power considerations will allow only a fraction of the chip to be powered on, judicious resource management will become a key consideration in future designs. Most works on resource management treat only the physical components (i.e. computation, communication, and memory blocks) as resources and manipulate the component-to-application mapping to optimize various parameters (e.g. energy efficiency). To further enhance the optimization potential, in addition to the physical resources we propose to manipulate abstract resources (i.e. the voltage/frequency operating point, the fault-tolerance strength, the degree of parallelism, and the configuration architecture). The proposed framework (i.e. VRAP) encapsulates methods, algorithms, and hardware blocks to provide each application with the abstract resources tailored to its needs. To test the efficacy of this concept, we have developed three distinct self-adaptive environments: (i) Private Operating Environment (POE), (ii) Private Reliability Environment (PRE), and (iii) Private Configuration Environment (PCE), which collectively ensure that each application meets its deadlines using minimal platform resources. In this work, several novel architectural enhancements, algorithms and policies are presented to realize the virtual runtime application partitions efficiently. Considering future design trends, we have chosen Coarse Grained Reconfigurable Architectures (CGRAs) and Networks-on-Chip (NoCs) to test the feasibility of our approach. Specifically, we have chosen the Dynamically Reconfigurable Resource Array (DRRA) and McNoC as the representative CGRA and NoC platforms. The proposed techniques are compared and evaluated using a variety of quantitative experiments.
Synthesis and simulation results demonstrate that VRAP significantly enhances energy and power efficiency compared to the state of the art.
Abstract:
Meandering rivers have been perceived to evolve rather similarly around the world, independently of the location or size of the river. Despite their many consistent processes and characteristics, they have also been noted to show complex and unique sets of fluvio-morphological processes in which local factors play an important role. These complex interactions of flow and morphology notably affect the development of the river. Comprehensive and fundamental field, flume and theoretically based studies of fluvio-morphological processes in meandering rivers were carried out especially during the latter part of the 20th century. However, as these studies were conducted with traditional field measurement techniques, their spatial and temporal resolution does not match the level achievable today. The hypothesis of this study is that exploiting the increased spatial and temporal resolution of data, achieved by combining conventional field measurements with a range of modern technologies, will provide new insights into the spatial patterns of flow-sediment interaction in meandering streams, which have been perceived to show notable variation in space and time. This thesis shows how modern technologies can be combined to derive very high spatial and temporal resolution data on fluvio-morphological processes over meander bends. The flow structure over the bends is recorded in situ using an acoustic Doppler current profiler (ADCP), and the spatial and temporal resolution of the flow data is enhanced using 2D and 3D CFD over various meander bends. The CFD models are also exploited to simulate sediment transport. Multi-temporal terrestrial laser scanning (TLS), mobile laser scanning (MLS) and echo sounding data are used to measure the flow-induced changes and formations over meander bends and to build the computational models.
The spatial patterns of erosion and deposition over meander bends are analysed relative to the measured and modelled flow field and sediment transport. The results are compared with the classic theories of the processes in meander bends. In the main, the results of this study agree well with the existing theories and with the results of previous studies. However, some new insights regarding the spatial and temporal patterns of the flow-sediment interaction in a natural sand-bed meander bend are provided. The results of this study show the advantages of the rapid and detailed measurement techniques and of the spatial and temporal resolution provided by CFD, unachievable with field measurements. The thesis also discusses the limitations which remain in the measurement and modelling methods and in the understanding of the fluvial geomorphology of meander bends. Further, the hydro- and morphodynamic models’ sensitivity to user-defined parameters is tested, and the modelling results are assessed against detailed field measurements. The study is implemented in the meandering sub-Arctic Pulmanki River in Finland. The river is unregulated and sand-bedded, and major morphological changes occur annually on the meander point bars, which are inundated only during the snow-melt-induced spring floods. The outcome of this study applies to sand-bed meandering rivers in regions where normally one significant flood event occurs annually, such as Arctic areas with snow-melt-induced spring floods, where the point bars of the meander bends are inundated only during the flood events.
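As a small worked example of the kind of hydraulic quantity underlying such erosion and deposition analyses (a textbook formula, not a method specific to this thesis), the depth-slope product gives a first-order estimate of reach-averaged bed shear stress; the input values below are invented:

```python
# Depth-slope product estimate of reach-averaged boundary shear stress,
# tau = rho * g * h * S, a standard first-order formula in fluvial
# geomorphology. Inputs here are illustrative.
def bed_shear_stress(depth_m, slope, rho=1000.0, g=9.81):
    """Boundary shear stress in Pa for water of density rho (kg/m^3)."""
    return rho * g * depth_m * slope

tau = bed_shear_stress(1.2, 0.001)  # 1.2 m deep flow on a slope of 0.001
```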
Abstract:
The emergence of depth sensors has made it possible to track not only monocular cues but also the actual depth values of the environment. This is especially useful in augmented reality solutions, where the position and orientation (pose) of the observer need to be accurately determined. This allows virtual objects to be placed into the view of the user through, for example, the screen of a tablet or augmented reality glasses (e.g. Google Glass). Although the early 3D sensors were physically quite large, the size of these sensors is decreasing, and eventually a 3D sensor could perhaps be embedded, for example, in augmented reality glasses. The wider subject area considered in this review is 3D SLAM methods, which take advantage of the 3D information made available by modern RGB-D sensors, such as the Microsoft Kinect. Thus, a review of SLAM (Simultaneous Localization and Mapping) and 3D tracking in augmented reality is a timely subject. We also try to identify the limitations and possibilities of the different tracking methods, and how they should be improved in order to allow efficient integration of these methods into the augmented reality solutions of the future.
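At the core of many SLAM and 3D tracking pipelines is a rigid alignment step between sensed and mapped points. The sketch below shows a minimal closed-form version in 2D with known correspondences; real RGB-D systems work in 3D (typically via an SVD) and alternate correspondence matching with alignment, as in ICP:

```python
import math

# Minimal 2D sketch of the rigid-alignment step at the heart of ICP-style
# scan matching. Correspondences are assumed known here, whereas real ICP
# alternates nearest-neighbour matching and alignment.
def align_2d(src, dst):
    """Return (theta, tx, ty) such that rotating src by theta and
    translating by (tx, ty) best maps it onto dst (least squares)."""
    n = len(src)
    cx_s = sum(p[0] for p in src) / n
    cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n
    cy_d = sum(p[1] for p in dst) / n
    s_cos = s_sin = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs, ys = xs - cx_s, ys - cy_s        # centre both point sets
        xd, yd = xd - cx_d, yd - cy_d
        s_cos += xs * xd + ys * yd           # dot products -> cos component
        s_sin += xs * yd - ys * xd           # cross products -> sin component
    theta = math.atan2(s_sin, s_cos)
    tx = cx_d - (cx_s * math.cos(theta) - cy_s * math.sin(theta))
    ty = cy_d - (cx_s * math.sin(theta) + cy_s * math.cos(theta))
    return theta, tx, ty
```

Accumulating such incremental alignments gives the sensor trajectory; the "mapping" half of SLAM then fuses the aligned scans into a consistent model.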
Abstract:
This thesis addresses the coolability of porous debris beds in the context of severe accident management of nuclear power reactors. In a hypothetical severe accident at a Nordic-type boiling water reactor, the lower drywell of the containment is flooded, for the purpose of cooling the core melt discharged from the reactor pressure vessel in a water pool. The melt is fragmented and solidified in the pool, ultimately forming a porous debris bed that generates decay heat. The properties of the bed determine the limiting value for the heat flux that can be removed from the debris to the surrounding water without the risk of re-melting. The coolability of porous debris beds has been investigated experimentally by measuring the dryout power in electrically heated test beds that have different geometries. The geometries represent the debris bed shapes that may form in an accident scenario. The focus is especially on heap-like, realistic geometries which facilitate the multi-dimensional infiltration (flooding) of coolant into the bed. Spherical and irregular particles have been used to simulate the debris. The experiments have been modeled using 2D and 3D simulation codes applicable to fluid flow and heat transfer in porous media. Based on the experimental and simulation results, an interpretation of the dryout behavior in complex debris bed geometries is presented, and the validity of the codes and models for dryout predictions is evaluated. According to the experimental and simulation results, the coolability of the debris bed depends on both the flooding mode and the height of the bed. In the experiments, it was found that multi-dimensional flooding increases the dryout heat flux and coolability in a heap-shaped debris bed by 47–58% compared to the dryout heat flux of a classical, top-flooded bed of the same height. However, heap-like beds are higher than flat, top-flooded beds, which results in the formation of a larger steam flux at the top of the bed.
This counteracts the effect of the multi-dimensional flooding. Based on the measured dryout heat fluxes, the maximum height of a heap-like bed can only be about 1.5 times the height of a top-flooded, cylindrical bed in order to preserve the direct benefit from the multi-dimensional flooding. In addition, studies were conducted to evaluate the hydrodynamically representative effective particle diameter, which is applied in simulation models to describe debris beds that consist of irregular particles with considerable size variation. The results suggest that the effective diameter is small, closest to the mean diameter based on the number or length of particles.
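To illustrate the "effective particle diameter" notion mentioned above, the sketch below computes the number-based and length-based mean diameters for a hypothetical sieved sample; the sieve classes and counts are invented, and the thesis's hydrodynamic evaluation of which mean is most representative is not reproduced here:

```python
# Illustrative sketch: number-based and length-based mean diameters of a
# sieved particle sample. Diameters (mm) and counts are invented.
def mean_diameters(diameters, counts):
    """Return (number-based mean, length-based mean) of the sample."""
    total_n = sum(counts)
    sum_d = sum(d * c for d, c in zip(diameters, counts))
    sum_d2 = sum(d * d * c for d, c in zip(diameters, counts))
    return sum_d / total_n, sum_d2 / sum_d

d_number, d_length = mean_diameters([1.0, 2.0, 4.0], [10, 5, 1])
```

Because the number-based mean weights the many small particles heavily, it is the smallest of the common moment-ratio means, consistent with the finding that the effective diameter is small.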
Abstract:
User experience is a crucial element in interactive storytelling, and as such it is important to recognize the different aspects of a positive user experience in an interactive story. Towards that goal, in the first half of this thesis, we will go through the different elements that make up the user experience, with a strong focus on agency. Agency can be understood as the user’s ability to affect the story, or the world in which the story is told, with interesting and satisfying choices. The freedoms granted by agency are not completely compatible with traditional storytelling, and as such we will also go through some of the issues of agency-centric design philosophies and explore alternate schools of thought. The core purpose of this thesis is to determine the most important aspects of agency with regard to a positive user experience, and to attempt to find ways for authors to improve the overall quality of user experience in interactive stories. The latter half of this thesis deals with the research conducted on this matter. This research was carried out by analyzing data from an online survey coupled with data gathered by the interactive storytelling system made specifically for this research (Regicide). The most important aspects of this research deal with influencing perceived agency and facilitating an illusion of agency in different ways, and with comparing user experiences across these different test environments. The most important findings of this research include the importance of context-controlled and focused agency, of the settings in which the agency takes place, and of ensuring user competency within an interactive storytelling system. Another essential conclusion of this research boils down to communication between the user and the system: the goal of influencing perceived agency should primarily be to ensure that the user is aware of all the theoretical agency they possess.
Abstract:
The last two decades have provided a vast opportunity to live in and explore compelling imaginary worlds, or virtual worlds, through massively multiplayer online role-playing games (MMORPGs). MMORPGs give their users a wide range of opportunities to participate with multiple players on the same platform, to communicate and to perform real-time actions. There is a virtual economy in these games which is largely player-driven. In-game currency enables users to build up their avatars and to buy or sell the goods necessary to play and survive in the games. Concerning the virtual economies generated through EVE Online, this thesis mainly focuses on how the prices of minerals in EVE Online behave, by applying the Jabłonska-Capasso-Morale (JCM) mathematical simulation model. The aim is to verify to what degree the model can reproduce the virtual economy's behavior. The model is applied to the buy and sell prices of two minerals, namely isogen and morphite. The simulation results demonstrate that the JCM model fits the mineral prices reasonably well, which lets us conclude that virtual economies behave similarly to the real ones.
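The JCM model itself is not reproduced here; as a hedged stand-in, the sketch below simulates a mean-reverting log-price (Ornstein-Uhlenbeck style), a common baseline for commodity-like assets such as in-game minerals. All parameters are invented for illustration:

```python
import math
import random

# Hedged stand-in for a price model: mean-reverting log-price dynamics.
# This is NOT the JCM model; parameters are illustrative only.
def simulate_price(p0=100.0, mean=100.0, kappa=0.05, sigma=0.02,
                   steps=500, seed=7):
    rng = random.Random(seed)
    log_p = math.log(p0)
    log_mean = math.log(mean)
    path = [p0]
    for _ in range(steps):
        # pull the log-price toward its long-run mean, plus Gaussian noise
        log_p += kappa * (log_mean - log_p) + sigma * rng.gauss(0.0, 1.0)
        path.append(math.exp(log_p))
    return path

path = simulate_price()  # e.g. a simulated buy-price series for a mineral
```

Fitting such a model to observed buy and sell prices, and comparing simulated to observed paths, is the general workflow behind validation claims of this kind.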
Abstract:
Objective: Overuse injuries in violinists are a problem that has primarily been analyzed through the use of questionnaires. Simultaneous 3D motion analysis and EMG measurement of muscle activity has been suggested as a quantitative technique to explore this problem by identifying movement patterns and muscular demands which may predispose violinists to overuse injuries. This multi-disciplinary analysis technique has, so far, had limited use in the music world. The purpose of this study was to use it to characterize the demands of a violin bowing task. Subjects: Twelve injury-free violinists volunteered for the study. The subjects were assigned to a novice or expert group based on playing experience, as determined by questionnaire. Design and Settings: Muscle activity and movement patterns were assessed while violinists played five bowing cycles (one bowing cycle = one down-bow + one up-bow) on each string (G, D, A, E), at a pulse of 4 beats per bow and 100 beats per minute. Measurements: An upper extremity model created using coordinate data from markers placed on the right acromion process, lateral epicondyle of the humerus and ulnar styloid was used to determine minimum and maximum joint angles, ranges of motion (ROM) and angular velocities at the shoulder and elbow of the bowing arm. Muscle activity in the right anterior deltoid, biceps brachii and triceps brachii was assessed during maximal voluntary contractions (MVC) and during the playing task. Data were analysed for significant differences across the strings and between experience groups. Results: Elbow flexion/extension ROM was similar across strings for both groups. Shoulder flexion/extension ROM increased across the strings and was larger for the experts. Angular velocity changes mirrored changes in ROM. The deltoid was the most active of the muscles assessed (20% MVC) and displayed a pattern of constant activation to maintain shoulder abduction. Biceps and triceps were less active (4-12% MVC) and showed a more periodic 'on and off' pattern.
Novices' muscle activity was higher in all cases. The experts' muscle activity showed a consistent pattern across strings, whereas the novices' was more irregular. The agonist-antagonist roles of biceps and triceps during the bowing motion were clearly defined in the expert group, but not as apparent in the novice group. Conclusions: The bowing movement appears to be controlled by the shoulder rather than the elbow, as shoulder ROM changed across strings while elbow ROM remained the same. Shoulder injuries are probably due to repetition, as the muscle activity required for the movement is small. Experts require a smaller amount of muscle activity to perform the movement, possibly due to more efficient muscle activation patterns resulting from practice. This quantitative, multi-disciplinary approach to analysing violinists' movements can contribute to a fuller understanding of both playing demands and injury mechanisms.
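The kinematic quantities reported above (minimum and maximum joint angle, ROM and angular velocity) can be computed from a sampled joint-angle time series as sketched below; the sample values are synthetic, and marker processing and filtering are omitted:

```python
# Sketch of the reported kinematic quantities from a joint-angle series
# sampled at fs_hz. Data here are invented; real pipelines would filter
# the marker trajectories first.
def rom_and_velocity(angles_deg, fs_hz):
    """Return (min, max, ROM) in degrees and peak angular speed in deg/s."""
    lo, hi = min(angles_deg), max(angles_deg)
    # finite-difference angular speed between consecutive samples
    speeds = [abs(b - a) * fs_hz for a, b in zip(angles_deg, angles_deg[1:])]
    return lo, hi, hi - lo, max(speeds)

lo, hi, rom, peak = rom_and_velocity([10.0, 20.0, 40.0, 30.0], 100.0)
```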
Abstract:
Three-dimensional model design is a well-known and studied field, with numerous real-world applications. However, the manual construction of these models can often be time-consuming for the average user, despite the advantages offered by computational advances. This thesis presents an approach to the design of 3D structures using evolutionary computation and L-systems, which involves the automated production of such designs using a strict set of fitness functions. These functions focus on the geometric properties of the models produced, as well as on their quantifiable aesthetic value, a topic which has not been widely investigated with respect to 3D models. New extensions to existing aesthetic measures are discussed and implemented in the presented system in order to produce designs which are visually pleasing. The system itself facilitates the construction of models requiring minimal user initialization and no user-based feedback throughout the evolutionary cycle. The models evolved by genetic programming are shown to satisfy multiple criteria, conveying a relationship between their assigned aesthetic value and their perceived aesthetic value. An exploration of the applicability and effectiveness of a multi-objective approach to the problem is also presented, with a focus on both performance and visual results. Although subjective, these results offer insight into future applications and study in the field of computational aesthetics and automated structure design.
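A deterministic L-system of the kind used to generate such structures can be expanded in a few lines; the rules below are the classic algae example, not the rule sets evolved in the thesis:

```python
# Minimal deterministic L-system rewriter; symbols without a rule map to
# themselves. Interpreting the resulting string (e.g. as turtle-graphics
# commands) is what turns it into geometry.
def expand(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

result = expand("A", {"A": "AB", "B": "A"}, 5)
```

In an evolutionary setting, the axiom and rules become the genotype, and fitness functions (geometric or aesthetic) score the structure produced by interpreting the expanded string.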
Characterizing Dynamic Optimization Benchmarks for the Comparison of Multi-Modal Tracking Algorithms
Abstract:
Population-based metaheuristics, such as particle swarm optimization (PSO), have been employed to solve many real-world optimization problems. Although it is often sufficient to find a single solution to these problems, there exist cases where identifying multiple, diverse solutions can be beneficial or even required. Some of these problems are further complicated by a change in their objective function over time. This type of optimization is referred to as dynamic, multi-modal optimization. Algorithms which locate and maintain multiple optima in a search space are identified as niching algorithms. Although numerous dynamic niching algorithms have been developed, their performance is often measured solely on their ability to find a single, global optimum. Furthermore, the comparisons often use synthetic benchmarks whose landscape characteristics are generally limited and unknown. This thesis provides a landscape analysis of the dynamic benchmark functions commonly developed for multi-modal optimization. The benchmark analysis results reveal that the mechanisms responsible for dynamism in the current dynamic benchmarks do not significantly affect landscape features, thus suggesting a lack of representation for problems whose landscape features vary over time. This analysis is used in a comparison of current niching algorithms to identify the effects that specific landscape features have on niching performance. Two performance metrics are proposed to measure both the scalability and accuracy of the niching algorithms. The algorithm comparison results demonstrate which algorithms are best suited to a variety of dynamic environments. This comparison also examines each of the algorithms in terms of its niching behaviours and analyzes the range of, and trade-off between, scalability and accuracy when tuning the algorithms' respective parameters.
These results contribute to the understanding of current niching techniques as well as the problem features that ultimately dictate their success.
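A dynamic, multi-modal benchmark of the kind analyzed above can be sketched as a "moving peaks" landscape: fitness is the maximum over several cone-shaped peaks whose positions shift at each environment change. The parameters below are illustrative, not those of any published benchmark suite:

```python
import random

# Moving-peaks style sketch: each peak is (px, py, height, slope); fitness
# at (x, y) is the highest cone value. shift() models one environment change.
def make_landscape(peaks):
    def f(x, y):
        return max(h - s * ((x - px) ** 2 + (y - py) ** 2) ** 0.5
                   for px, py, h, s in peaks)
    return f

def shift(peaks, rng, step=0.5):
    """Move every peak by up to `step` in each coordinate."""
    return [(px + rng.uniform(-step, step), py + rng.uniform(-step, step), h, s)
            for px, py, h, s in peaks]

peaks = [(0.0, 0.0, 10.0, 1.0), (5.0, 5.0, 8.0, 1.0)]
f = make_landscape(peaks)
shifted = shift(peaks, random.Random(1))
```

Note that shifting peak positions alone leaves features such as the number of optima and their heights unchanged, which is exactly the kind of limitation the landscape analysis above points out.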
Abstract:
In this thesis, we review and analyse networks that are formed by nodes with several attributes. We suppose that different layers of communities are embedded in such networks, where each layer is connected with one of the nodes' attributes. For example, consider a typical online social network: a user participates in a number of different groups/communities (schoolfellows, colleagues, clients, etc.). We introduce a detection algorithm for the above-mentioned communities. Normally, the result of detection is only the community defined by the most dominant attribute, disregarding the others. We propose an algorithm that bypasses dominant communities and detects communities which are formed by the nodes' other attributes. We also review formation models for attributed networks and present a Human Communication Network (HCN) model. We introduce a High School Texting Network (HSTN) and evaluate our methods on that network.
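The multi-layer idea can be illustrated as follows: each attribute defines its own layer of candidate communities, so communities induced by non-dominant attributes remain visible. This only sketches the grouping step with invented node names and attributes; the thesis's detection algorithm is more involved:

```python
from collections import defaultdict

# Sketch of per-attribute community layers: every attribute (school, work,
# ...) defines a layer, and each attribute value induces a candidate
# community in that layer. Node names and attributes are invented.
def communities_by_layer(nodes):
    """nodes: {node: {layer: value}} -> {layer: {value: set of nodes}}"""
    layers = defaultdict(lambda: defaultdict(set))
    for node, attrs in nodes.items():
        for layer, value in attrs.items():
            layers[layer][value].add(node)
    return layers

nodes = {"ann":  {"school": "A", "work": "X"},
         "bob":  {"school": "A", "work": "Y"},
         "carl": {"school": "B", "work": "Y"}}
layers = communities_by_layer(nodes)
```

Here the "work" layer groups bob with carl even though the "school" layer (if dominant) would only ever group bob with ann.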
Abstract:
In spatial environments, we consider social welfare functions satisfying Arrow's requirements, i.e., weak Pareto and independence of irrelevant alternatives. When the policy space is a one-dimensional continuum, such a welfare function is determined by a collection of 2n strictly quasi-concave preferences and a tie-breaking rule. As a corollary, we obtain that when the number of voters is odd, simple majority voting is transitive if and only if each voter's preference is strictly quasi-concave. When the policy space is multi-dimensional, we establish Arrow's impossibility theorem. Among other results, we show that weak Pareto, independence of irrelevant alternatives, and non-dictatorship are inconsistent if the set of alternatives has a non-empty interior and is compact and convex.
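The corollary can be illustrated computationally. With an odd number of voters whose preferences over a one-dimensional policy space are single-peaked (modelled here, as a simplifying assumption, by distance from an ideal point, one simple instance of strict quasi-concavity), pairwise majority voting yields a Condorcet winner at the median peak:

```python
# Illustration of the corollary with three single-peaked voters: distance
# from an ideal point stands in for a strictly quasi-concave preference.
def majority_winner(peaks, alternatives):
    """Return a Condorcet winner among the alternatives, if one exists."""
    def beats(a, b):
        votes_a = sum(1 for p in peaks if abs(a - p) < abs(b - p))
        votes_b = sum(1 for p in peaks if abs(b - p) < abs(a - p))
        return votes_a > votes_b
    return next(a for a in alternatives
                if all(beats(a, b) for b in alternatives if b != a))

# Three voters with peaks at 0.2, 0.5 and 0.9: the median voter (0.5) wins.
winner = majority_winner([0.2, 0.5, 0.9], [0.1, 0.4, 0.5, 0.8])
```

With an even number of voters, or with non-quasi-concave preferences, the `next(...)` search can fail because majority cycles may leave no Condorcet winner, which is precisely what the transitivity result rules out.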