169 results for Labelled graphs


Relevance: 10.00%

Publisher:

Abstract:

We present a novel approach for preprocessing systems of polynomial equations via graph partitioning. The variable-sharing graph of a system of polynomial equations is defined. If this graph is disconnected, the corresponding system of equations can be split into smaller ones that can be solved individually. This can provide a tremendous speed-up in computing the solution to the system, but such disconnection is unlikely to occur either randomly or in applications. However, by deleting certain vertices, the variable-sharing graph can be disconnected in a balanced fashion, and in turn the system of polynomial equations separated into smaller systems of near-equal size. In graph-theoretic terms, this process is equivalent to finding balanced vertex partitions with minimum-weight vertex separators. Techniques for finding these vertex partitions are discussed, and experiments are performed to evaluate their practicality for general graphs and systems of polynomial equations. Applications of this approach to algebraic cryptanalysis of symmetric ciphers are presented: for the QUAD family of stream ciphers, we show how a malicious party can manufacture conforming systems that can be easily broken; for the stream ciphers Bivium and Trivium, we achieve significant speedups in algebraic attacks against them, mainly in a partial key guess scenario. In each of these cases, the systems of polynomial equations involved are well-suited to our graph partitioning method. These results may open a new avenue for evaluating the security of symmetric ciphers against algebraic attacks.
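To make the variable-sharing graph concrete, the following is a minimal sketch in Python using networkx; it illustrates the idea rather than the authors' implementation. Variables are vertices, two variables are joined by an edge whenever they appear in the same equation, and connected components then correspond to subsystems that can be solved independently (a balanced vertex separator, e.g. via networkx's minimum_node_cut, would be the next step when the graph is connected).

```python
# Sketch: build the variable-sharing graph of a polynomial system and split it
# into independent subsystems via connected components. Illustrative only.
import networkx as nx

def variable_sharing_graph(equations):
    """equations: iterable of sets of variable names occurring in each equation."""
    g = nx.Graph()
    for eq_vars in equations:
        eq_vars = list(eq_vars)
        g.add_nodes_from(eq_vars)
        # join every pair of variables that share this equation
        for i in range(len(eq_vars)):
            for j in range(i + 1, len(eq_vars)):
                g.add_edge(eq_vars[i], eq_vars[j])
    return g

# Example: {x*y = 1, y + z = 0, w^2 = w} splits into two independent subsystems.
system = [{"x", "y"}, {"y", "z"}, {"w"}]
g = variable_sharing_graph(system)
print(list(nx.connected_components(g)))  # e.g. [{'x', 'y', 'z'}, {'w'}]
```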

Relevance: 10.00%

Publisher:

Abstract:

In an environment where it has become increasingly difficult to attract consumer attention, marketers have begun to explore alternative forms of marketing communication. One such form that has emerged is product placement, which has more recently appeared in electronic games. Given changes in media consumption and the growth of the games industry, it is not surprising that games are being exploited as a medium for promotional content. Other market developments are also facilitating and encouraging their use, in terms of both the insertion of brand messages into video games and the creation of brand-centred environments, labelled ‘advergames’. However, while there is much speculation concerning the beneficial outcomes for marketers, there remains a lack of academic work in this area and little empirical evidence of the actual effects of this form of promotion on game players. Only a handful of studies exploring the influence of game placements on consumers are evident in the literature. The majority have studied their effect on brand awareness, largely demonstrating that players can recall placed brands. Further, most research conducted to date has focused on computer and online games, even though consoles represent the dominant platform for play (Taub, 2004). Finally, advergames have largely been neglected, particularly those in a console format. Widening the gap in the literature is the fact that insufficient academic attention has been given to product placement as a marketing communication strategy overall, and to games in general. The unique nature of the strategy also makes it difficult to apply existing literature to this context. To address a significant need for information in both the academic and business domains, the current research investigates the effects of brand and product placements in video games and advergames on consumer attitude to the brand and corporate image. It was conducted in two stages. Stage one represents a pilot study. It explored the effects of use-simulated and peripheral placements in video games on players’ and observers’ attitudinal responses, and whether these are influenced by involvement with a product category or skill level in the game. The ability of gamers to recall placed brands was also examined. A laboratory experiment was employed with a small sample of sixty adult subjects drawn from an Australian east-coast university, some of whom were exposed to a console video game on a television set. The major finding of study one is that placements in a video game have no effect on gamers’ attitudes, but they are recalled. For stage two of the research, a field experiment was conducted with a large, random sample of 350 student respondents to investigate the effects on players of brand and product placements in handheld video games and advergames. The constructs of brand attitude and corporate image were again tested, along with several potential confounds. Consistent with the pilot, the results demonstrate that product placement in electronic games has no effect on players’ brand attitudes or corporate image, even when allowing for their involvement with the product category, skill level in the game, or skill level in relation to the medium. Age and gender also have no impact. However, the more interactive a player perceives the game to be, the higher their attitude to the placed brand and corporate image of the brand manufacturer.
In other words, when controlling for perceived interactivity, players experienced more favourable attitudes, but the effect was so weak that it is unlikely to have practical significance. It is suggested that this result can be explained by the existence of excitation transfer, rather than any processing of placed brands. The current research provides strong, empirical evidence that brand and product placements in games do not produce strong attitudinal responses. It appears that the nature of the game medium, game playing experience and product placement impose constraints on gamer motivation, opportunity and ability to process these messages, thereby precluding their impact on attitude to the brand and corporate image. Since this is the first study to investigate the ability of video game and advergame placements to facilitate these deeper consumer responses, further research across different contexts is warranted. Nevertheless, the findings have important theoretical and managerial implications. This investigation makes a number of valuable contributions. First, it is relevant to current marketing practice and presents findings that can help guide promotional strategy decisions. It also presents a comprehensive review of the games industry and associated activities in the marketplace, relevant for marketing practitioners. Theoretically, it contributes new knowledge concerning product placement, including how it should be defined, its classification within the existing communications framework, its dimensions and effects. This is extended to include brand-centred entertainment. The thesis also presents the most comprehensive analysis available in the literature of how placements appear in games. In the consumer behaviour discipline, the research builds on theory concerning attitude formation, through application of MacInnis and Jaworski’s (1989) Integrative Attitude Formation Model. With regard to the games literature, the thesis provides a structured framework for the comparison of games with different media types; it advances understanding of the game medium, its characteristics and the game playing experience; and provides insight into console and handheld games specifically, as well as interactive environments generally. This study is the first to test the effects of interactivity in a game environment, and presents a modified scale that can be used as part of future research. Methodologically, it addresses the limitations of prior research through execution of a field experiment and observation with a large sample, making this the largest study of product placement in games available in the literature. Finally, the current thesis offers comprehensive recommendations that will provide structure and direction for future study in this important field.

Relevance: 10.00%

Publisher:

Abstract:

Nature Refuges encompass the second largest extent of protected area estate in Queensland. Major problems exist in the data capture, map presentation, data quality and integrity of these boundaries. The spatial accuracy of the Nature Refuge administrative boundaries directly influences the ability to preserve valuable ecosystems by challenging negative environmental impacts on these properties. This research supports the Nature Refuge Program’s efforts to secure Queensland’s natural and cultural values on private land by utilising GIS and its advanced functionalities. The research design organises and enters Queensland’s Nature Refuge boundaries into a spatial environment. Survey-quality data collection techniques, such as the Global Positioning System (GPS), are investigated to capture Nature Refuge boundary information. Using the concepts of map communication, GIS cartography is utilised for protected area plan design. New spatial datasets are generated, facilitating investigative data analysis. The geodatabase model developed by this study adds rich GIS behaviour, providing the capability to store, query, and manipulate geographic information. It provides the ability to leverage data relationships and enforces topological integrity, creating savings in customisation and gains in productivity. The final phase of the research design incorporates the advanced functions of ArcGIS. These functions facilitate building spatial system models. The geodatabase and process models developed by this research can be easily modified, and the data relating to mining can be replaced by other negative environmental impacts affecting the Nature Refuges. Results of the research are presented as graphs and maps, providing visual evidence of the usefulness of GIS as a means for capturing, visualising and enhancing the spatial quality and integrity of Nature Refuge boundaries.

Relevance: 10.00%

Publisher:

Abstract:

Where object-oriented languages deal with objects as described by classes, model-driven development uses models, as graphs of interconnected objects, described by metamodels. A number of new languages have been and continue to be developed for this model-based paradigm, both for model transformation and for general programming using models. Many of these use single-object approaches to typing, derived from solutions found in object-oriented systems, while others use metamodels as model types, but without a clear notion of polymorphism. Both of these approaches lead to brittle and overly restrictive reuse characteristics. In this paper we propose a simple extension to object-oriented typing to better cater for a model-oriented context, including a simple strategy for typing models as a collection of interconnected objects. We suggest extensions to existing type system formalisms to support these concepts and their manipulation. Using a simple example we show how this extended approach permits more flexible reuse, while preserving type safety.
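As a rough illustration of typing a model as a collection of interconnected objects (a hypothetical Python sketch, not the type-system formalism proposed in the paper): a model type can be represented as a set of object types with named references, and a model, a graph of objects, conforms if every object and every reference matches the declared structure.

```python
# Hypothetical sketch of structural conformance between a model (object graph)
# and a model type (a collection of object types plus their references).
from dataclasses import dataclass, field

@dataclass
class ObjectType:
    name: str
    references: dict = field(default_factory=dict)  # reference name -> target type name

@dataclass
class Obj:
    type_name: str
    refs: dict = field(default_factory=dict)        # reference name -> target Obj

def conforms(model_objects, model_type):
    """model_type: dict mapping type name -> ObjectType. Every object must have a
    declared type, and every reference must point to an object of the expected type."""
    for obj in model_objects:
        otype = model_type.get(obj.type_name)
        if otype is None:
            return False
        for ref_name, target in obj.refs.items():
            expected = otype.references.get(ref_name)
            if expected is None or target.type_name != expected:
                return False
    return True

# Example: a tiny "state machine" model type and a conforming two-object model.
sm_type = {"State": ObjectType("State", {"next": "State"})}
s1, s2 = Obj("State"), Obj("State")
s1.refs["next"] = s2
print(conforms([s1, s2], sm_type))  # True
```

A subtype relation between model types (for example, permitting extra object types or references) is where the more flexible reuse discussed in the paper would come into play.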

Relevance: 10.00%

Publisher:

Abstract:

Civic participation of young people around the world is routinely described in deficit terms, as they are labelled apathetic, devoid of political knowledge, disengaged from the community and self-absorbed (Andolina, 2002; Weller, 2006). This paper argues that the connectivity of time, space and social values (Lefebvre, 1991; Soja, 1996) is integral to understanding the performances of young people as civic subjects. Today’s youth negotiate unstable social, economic and environmental conditions, new technologies and new forms of community. Loyalty, citizenship and notions of belonging take on new meanings in these changing global conditions. Using the socio-spatial theories of Lefebvre and Foucault, and the tools of critical discourse analysis, this paper argues that the chronotope, or time/space relationship, of universities produces student citizens who, in resistance to a complex global society, create a cocooned space focused on moral and spiritual values that can be enacted on a personal level.

Relevance: 10.00%

Publisher:

Abstract:

Understanding the expected safety performance of rural signalized intersections is critical for (a) identifying high-risk sites where the observed safety performance is substantially worse than the expected safety performance, (b) understanding influential factors associated with crashes, and (c) predicting the future performance of sites and helping plan safety-enhancing activities. These three critical activities are routinely conducted for safety management and planning purposes in jurisdictions throughout the United States and around the world. This paper aims to develop baseline expected safety performance functions for rural signalized intersections in South Korea, which have not yet been established or reported in the literature. Data are examined from numerous locations within South Korea for both three-legged and four-legged configurations. The safety effects of a host of operational and geometric variables on the safety performance of these sites are also examined. In addition, supplementary tables and graphs are developed for comparing the baseline safety performance of sites with various geometric and operational features. These graphs identify how various factors are associated with safety. The expected safety prediction tables offer advantages over regression prediction equations by allowing the safety manager to isolate specific features of the intersections and examine their impact on expected safety. The examination of the expected safety performance tables through illustrated examples highlights the need to correct for regression-to-the-mean effects, emphasizes the negative impacts of multicollinearity, shows why multivariate models do not translate well to accident modification factors, and illuminates the need to examine road safety carefully and methodically. Caveats are provided on the use of the safety performance prediction graphs developed in this paper.
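For context, baseline safety performance functions of this kind are typically log-linear models of crash frequency in traffic exposure; the sketch below shows that generic form with hypothetical coefficients, not the functions actually estimated in this paper.

```python
# Generic form of a baseline safety performance function (hypothetical coefficients):
# expected crashes/year = exp(b0) * AADT_major^b1 * AADT_minor^b2
import math

def expected_crashes_per_year(aadt_major, aadt_minor, b0=-8.0, b1=0.6, b2=0.4):
    return math.exp(b0) * aadt_major ** b1 * aadt_minor ** b2

# Example: a signalized intersection carrying 15,000 and 4,000 vehicles/day.
print(round(expected_crashes_per_year(15000, 4000), 2))
```

The paper's prediction tables serve the same purpose as such an equation while letting the analyst read off the effect of individual features directly.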

Relevance: 10.00%

Publisher:

Abstract:

The impact of what has been broadly labelled the knowledge economy has been such that, even in the absence of precise measurement, it is the undoubted dynamo of today’s global market and an essential part of any global city. The socio-economic importance of knowledge production in a knowledge economy is clear, and it is an emerging social phenomenon and research agenda in geographical studies. Knowledge production, and where, how and by whom it is produced, is an urban phenomenon that is poorly understood in an era of strong urbanisation. This paper focuses on knowledge community precincts as the catalytic magnet infrastructures impacting on knowledge production in cities. The paper discusses the increasing importance of knowledge-based urban development within the paradigm of the knowledge economy, and the role of knowledge community precincts as instruments to seed the foundation of knowledge production in cities. It then explores the knowledge-based urban development potential, and particularly the knowledge community precinct development potential, of Sydney, Melbourne and Brisbane, and benchmarks this against that of Boston, Massachusetts.

Relevance: 10.00%

Publisher:

Abstract:

There is a need for educational frameworks for computer ethics education. This discussion paper presents an approach to developing students’ moral sensitivity, an awareness of morally relevant issues, in project-based learning (PjBL). The proposed approach is based on a study of IT professionals’ levels of awareness of ethics. These levels are labelled My world, The corporate world, A shared world, The client’s world and The wider world. We give recommendations for how instructors may stimulate students’ thinking with the levels and how the levels may be taken into account in managing a project course and in an IS department. Limitations of the recommendations are assessed and issues for discussion are raised.

Relevance: 10.00%

Publisher:

Abstract:

Impedance cardiography is an application of bioimpedance analysis primarily used in a research setting to determine cardiac output. It is a non-invasive technique that measures the change in the impedance of the thorax that is attributed to the ejection of a volume of blood from the heart. The cardiac output is calculated from the measured impedance using the parallel conductor theory and a constant value for the resistivity of blood. However, the resistivity of blood has been shown to be velocity dependent due to changes in the orientation of red blood cells induced by changing shear forces during flow. The overall goal of this thesis was to study the effect that flow deviations have on the electrical impedance of blood, both experimentally and theoretically, and to apply the results to a clinical setting. The resistivity of stationary blood is isotropic, as the red blood cells are randomly orientated due to Brownian motion. In the case of blood flowing through rigid tubes, the resistivity is anisotropic due to the biconcave discoidal shape and orientation of the cells. The generation of shear forces across the width of the tube during flow causes the cells to align with their minimal cross-sectional area facing the direction of flow, in order to minimise the shear stress experienced by the cells. This in turn results in a larger cross-sectional area of plasma and a reduction in the resistivity of the blood as the flow increases. Understanding the contribution of this effect to the thoracic impedance change is a vital step in achieving clinical acceptance of impedance cardiography. Published literature investigates the resistivity variations for constant blood flow. In this case, the shear forces are constant and the impedance remains constant during flow, at a magnitude which is less than that for stationary blood. The research presented in this thesis, however, investigates the variations in resistivity of blood during pulsatile flow through rigid tubes and the relationship between impedance, velocity and acceleration. Using rigid tubes isolates the impedance change to variations associated with changes in cell orientation only. The implications of red blood cell orientation changes for clinical impedance cardiography were also explored. This was achieved through measurement and analysis of the experimental impedance of pulsatile blood flowing through rigid tubes in a mock circulatory system. A novel theoretical model including cell orientation dynamics was developed for the impedance of pulsatile blood through rigid tubes. The impedance of flowing blood was theoretically calculated using analytical methods for flow through straight tubes and the numerical Lattice Boltzmann method for flow through complex geometries such as aortic valve stenosis. The result of the analytical theoretical model was compared to the experimental impedance measurements through rigid tubes. The impedance calculated for flow through a stenosis using the Lattice Boltzmann method provides results for comparison with impedance cardiography measurements collected as part of a pilot clinical trial to assess the suitability of using bioimpedance techniques to assess the presence of aortic stenosis. The experimental and theoretical impedance of blood was shown to inversely follow the blood velocity during pulsatile flow, with correlations of -0.72 and -0.74 respectively.
The results of both the experimental and theoretical investigations demonstrate that the acceleration of the blood is an important factor in determining the impedance, in addition to the velocity. During acceleration, the relationship between impedance and velocity is linear (r² = 0.98, experimental and r² = 0.94, theoretical). The relationship between the impedance and velocity during the deceleration phase is characterised by a time decay constant, τ, ranging from 10 to 50 s. The high level of agreement between the experimental and theoretically modelled impedance demonstrates the accuracy of the model developed here. An increase in the haematocrit of the blood resulted in an increase in the magnitude of the impedance change due to changes in the orientation of red blood cells. The time decay constant was shown to decrease linearly with the haematocrit for both experimental and theoretical results, although the slope of this decrease was larger in the experimental case. The radius of the tube influences the experimental and theoretical impedance for the same velocity of flow. However, when the velocity was divided by the radius of the tube (labelled the reduced average velocity), the impedance response was the same for two experimental tubes with equivalent reduced average velocity but different radii. The temperature of the blood was also shown to affect the impedance, with the impedance decreasing as the temperature increased. These results are the first published for the impedance of pulsatile blood. The experimental impedance change measured orthogonal to the direction of flow is in the opposite direction to that measured in the direction of flow. These results indicate that the impedance of blood flowing through rigid cylindrical tubes is axisymmetric along the radius. This has not previously been verified experimentally. Time-frequency analysis of the experimental results demonstrated that the measured impedance contains the same frequency components, occurring at the same time points in the cycle, as the velocity signal. This suggests that the impedance contains many of the fluctuations of the velocity signal. Application of a theoretical steady-flow model to the pulsatile flow presented here verified that the steady-flow model is not adequate for calculating the impedance of pulsatile blood flow. The success of the new theoretical model over the steady-flow model demonstrates that the velocity profile is important in determining the impedance of pulsatile blood. The clinical application of the impedance of blood flow through a stenosis was theoretically modelled using the Lattice Boltzmann method (LBM) for fluid flow through complex geometries. The impedance of blood exiting a narrow orifice was calculated for varying degrees of stenosis. Clinical impedance cardiography measurements were also recorded for both aortic valvular stenosis patients (n = 4) and control subjects (n = 4) with structurally normal hearts. This pilot trial was used to corroborate the results of the LBM. Results from both investigations showed that the decay time constant for impedance has potential in the assessment of aortic valve stenosis. In the theoretically modelled case (LBM results), the decay time constant increased with an increase in the degree of stenosis. The clinical results also showed a statistically significant difference in the time decay constant between control and test subjects (P = 0.03).
The time decay constant calculated for test subjects (τ = 180-250 s) is consistently larger than that determined for control subjects (τ = 50-130 s). This difference is thought to be due to differences in the orientation response of the cells as blood flows through the stenosis. Such a non-invasive technique using the time decay constant for screening of aortic stenosis provides additional information to that currently given by impedance cardiography techniques and improves the value of the device to practitioners. However, the results still need to be verified in a larger study. While impedance cardiography has not been widely adopted clinically, it is research such as this that will enable future acceptance of the method.
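For background on the parallel-conductor calculation that this thesis re-examines, the sketch below uses the widely cited Kubicek formulation of stroke volume with a constant blood resistivity and hypothetical input values; it is exactly that constant-resistivity assumption which the flow-dependence results above call into question.

```python
# Kubicek-style estimate of stroke volume and cardiac output from thoracic impedance.
# Hypothetical example values; illustrative of the conventional approach only.
def stroke_volume_kubicek(rho, length, z0, lvet, dzdt_max):
    """SV (mL) = rho * (L / Z0)^2 * LVET * |dZ/dt|max, with rho the (assumed constant)
    blood resistivity (ohm*cm), L the electrode spacing (cm), Z0 the baseline thoracic
    impedance (ohm), LVET the ventricular ejection time (s), and dZ/dt max in ohm/s."""
    return rho * (length / z0) ** 2 * lvet * dzdt_max

sv = stroke_volume_kubicek(rho=135, length=30, z0=25, lvet=0.3, dzdt_max=1.2)
cardiac_output = sv * 70 / 1000  # 70 beats/min, converted to L/min
print(round(sv, 1), "mL/beat;", round(cardiac_output, 1), "L/min")  # ~70 mL/beat, ~4.9 L/min
```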

Relevance: 10.00%

Publisher:

Abstract:

With regard to the long-standing problem of the semantic gap between low-level image features and high-level human knowledge, the image retrieval community has recently shifted its emphasis from low-level feature analysis to high-level image semantics extraction. User studies reveal that users tend to seek information using high-level semantics. Therefore, image semantics extraction is of great importance to content-based image retrieval because it allows users to freely express what images they want. Semantic content annotation is the basis for semantic content retrieval. The aim of image annotation is to automatically obtain keywords that can be used to represent the content of images. The major research challenges in image semantic annotation are: what is the basic unit of semantic representation? how can the semantic unit be linked to high-level image knowledge? how can contextual information be stored and utilized for image annotation? In this thesis, Semantic Web technology (i.e. ontology) is introduced to the image semantic annotation problem. The Semantic Web, the next-generation web, aims at making the content of whatever type of media understandable not only to humans but also to machines. Due to the large amounts of multimedia data prevalent on the Web, researchers and industries are beginning to pay more attention to the Multimedia Semantic Web. The Semantic Web technology provides a new opportunity for multimedia-based applications, but research in this area is still in its infancy. Whether ontology can be used to improve image annotation, and how best to use ontology in semantic representation and extraction, is still a worthwhile investigation. This thesis deals with the problem of image semantic annotation using ontology and machine learning techniques in four phases, as below. 1) Salient object extraction. A salient object serves as the basic unit in image semantic extraction as it captures the common visual property of the objects. Image segmentation is often used as the first step for detecting salient objects, but most segmentation algorithms often fail to generate meaningful regions due to over-segmentation and under-segmentation. We develop a new salient object detection algorithm by combining multiple homogeneity criteria in a region merging framework. 2) Ontology construction. Since real-world objects tend to exist in a context within their environment, contextual information has been increasingly used for improving object recognition. In the ontology construction phase, visual-contextual ontologies are built from a large set of fully segmented and annotated images. The ontologies are composed of several types of concepts (i.e. mid-level and high-level concepts) and domain contextual knowledge. The visual-contextual ontologies stand as a user-friendly interface between low-level features and high-level concepts. 3) Image object annotation. In this phase, each object is labelled with a mid-level concept from the ontologies. First, a set of candidate labels is obtained by training Support Vector Machines with features extracted from salient objects. After that, contextual knowledge contained in the ontologies is used to obtain the final labels by removing ambiguous concepts. 4) Scene semantic annotation. The scene semantic extraction phase determines the scene type by using both mid-level concepts and domain contextual knowledge in the ontologies.
Domain contextual knowledge is used to create a scene configuration that describes which objects co-exist with which scene type more frequently. The scene configuration is represented in a probabilistic graph model, and probabilistic inference is employed to calculate the scene type given an annotated image. To evaluate the proposed methods, a series of experiments has been conducted on a large set of fully annotated outdoor scene images. These include a subset of the Corel database, a subset of the LabelMe dataset, the evaluation dataset of localized semantics in images, the spatial context evaluation dataset, and the segmented and annotated IAPR TC-12 benchmark.
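As a simplified illustration of the final phase, inferring a scene type from annotated object labels and contextual co-occurrence knowledge, the following naive-Bayes-style sketch uses hypothetical labels and probabilities; it is a stand-in for, not a reproduction of, the probabilistic graph model used in the thesis.

```python
# Pick the scene type whose objects co-occur most plausibly with the annotated objects.
import math

# Hypothetical contextual knowledge: P(object present | scene type).
object_given_scene = {
    "beach":  {"sky": 0.9, "sea": 0.8,  "sand": 0.7,  "tree": 0.2},
    "forest": {"sky": 0.6, "sea": 0.05, "sand": 0.05, "tree": 0.95},
}
scene_prior = {"beach": 0.5, "forest": 0.5}

def infer_scene(object_labels):
    """Return the scene type maximising log prior + sum of log object likelihoods."""
    scores = {}
    for scene, prior in scene_prior.items():
        score = math.log(prior)
        for obj in object_labels:
            score += math.log(object_given_scene[scene].get(obj, 1e-3))
        scores[scene] = score
    return max(scores, key=scores.get)

print(infer_scene(["sky", "sea", "sand"]))  # 'beach'
```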

Relevance: 10.00%

Publisher:

Abstract:

The Australian beach is now accepted as a significant part of Australian national culture and identity. However, Huntsman (2001) and Booth (2001) both believe that the beach is dying: “intellectuals have failed to apply to the beach the attention they have lavished on the bush…” (Huntsman 2001, 218). Yet the beach remains a prominent image in contemporary literature and film; authors such as Tim Winton and Robert Drewe frequently set their stories in and around the coast. Although the beach was initially considered a space of myth (Fiske, Hodge, and Turner 1987), Meaghan Morris labelled it ‘ordinary’ (1998), and as recently as 2001, in the wake of the Sydney Olympic Games, Bonner, McKee, and Mackay termed the beach ‘tacky’ and ‘familiar’. The beach, it appears, defies easy categorisation. In fact, I believe the beach is more than merely mythic or ordinary, or a combination of the two. Instead it is an imaginative space, seamlessly shifting its metaphorical meanings depending on readings of the texts. My studies examine the beach through five common beach myths; this paper will explore the myth of the beach as an egalitarian space. Contemporary Australian national texts no longer conform to these mythical representations (indeed, was the beach ever a space of equality?), instead creating new definitions for the beach space that continually shift in meaning. Recent texts such as Tim Winton’s Breath (2008) and Stephen Orr’s Time’s Long Ruin (2010) lay a more complex metaphorical meaning upon the beach space. This paper will explore the beach as a space of egalitarianism in conjunction with recent Australian fiction and films in order to discover how the contemporary beach is represented.

Relevance: 10.00%

Publisher:

Abstract:

Over the past two decades there has been a remarkable expansion in the use of executive coaching as an executive development technique. The increasing prominence of executive coaching has been attributed to the emergence of new organisational cultures and the subtler competencies needed by executives in these faster-moving organisations. The widespread popularity of executive coaching has been based largely on anecdotal feedback regarding its effectiveness. The small body of empirical research has been growing, but conclusive outcomes are rare. The prominent question for those with the business imperative to implement executive coaching has been: what are the ingredients of the process that engender an effective outcome? This investigation has focused on the factors of executive coaching that contribute to effectiveness. A qualitative methodology facilitated an in-depth study of the experiences of the participants of executive coaching, with the perceptions of both executives and coaches being sought. Semi-structured interviews and a focus group provided rich, thick descriptions and, together with a process of inductive analysis, produced findings that confidently identify the key factors that contribute to coaching effectiveness. Six major themes were identified, each comprising a collection of meanings. These themes have been labelled Executive Engagement, Preliminary Assessment and Feedback, Coaching Process, Coach’s Contribution, Trusting Relationship and Support from the Organisation. One theme, Coaching Process, comprises three significant sub-themes, namely Encouragement and Emotional Support, Challenge and Reflection, and Enhancing Executive Performance. The findings of this study add value to the field by identifying factors contributing to coaching effectiveness, and providing for the coaching practitioner a basis for enhancing their practice of executive coaching to better meet the needs of executives and their organisations.

Relevance: 10.00%

Publisher:

Abstract:

It is likely that effective application of cell-laden implants for cartilage defects depends on retention of implanted cells and interaction between implanted and host cells. The objectives of this study were to characterize stratified cartilaginous constructs seeded sequentially with superficial (S) and middle (M) chondrocyte subpopulations labelled with the fluorescent cell tracking dye PKH26 (*) and to determine the degree to which these stratified cartilaginous constructs maintain their architecture in vivo after implantation in mini-pigs for 1 week. Alginate-recovered cells were seeded sequentially to form stratified S*/M (only S cells labelled) and S*/M* (both S and M cells labelled) constructs. Full-thickness defects (4 mm diameter) were created in the patellofemoral groove of adult Yucatan mini-pigs and filled with portions of constructs or left empty. Constructs were characterized biochemically, histologically, and biomechanically, and stratification visualized and quantified, before and after implant. After 1 week, animals were sacrificed and implants retrieved. After 1 week in vivo, glycosaminoglycan and collagen content of constructs remained similar to that at implant, whereas DNA content increased. Histological analyses revealed features of an early repair response, with defects filled with tissues containing little matrix and abundant cells. Some implanted (PKH26-labelled) cells persisted in the defects, although constructs did not maintain a stratified organization. Of the labelled cells, 126 ± 38% and 32 ± 8% in S*/M and S*/M* constructs, respectively, were recovered. Distribution of labelled cells indicated interactions between implanted and host cells. Longer-term in vivo studies will be useful in determining whether implanted cells are sufficient to have a positive effect in repair.

Relevance: 10.00%

Publisher:

Abstract:

Porous yttria-stabilized zirconia (YSZ) has been regarded as a potential candidate for bone substitutes owing to its high mechanical strength. However, porous YSZ bodies are biologically inert to bone tissue. It is therefore necessary to introduce bioactive coatings onto the walls of the porous structures to enhance bioactivity. In this study, porous zirconia scaffolds were prepared by infiltration of acrylonitrile butadiene styrene (ABS) scaffolds with 3 mol% yttria-stabilized zirconia slurry. After sintering, a sol-gel dip-coating method was used to apply a coating layer of mesoporous bioglass (MBG). The porous zirconia without the coating had high porosities of 60.1% to 63.8%, and most macropores were interconnected, with pore sizes of 0.5-0.8 mm. The porous zirconia had compressive strengths of 9.07-9.90 MPa. Moreover, the average coating thickness was about 7 μm. There was no significant change in compressive strength for the porous zirconia with the mesoporous bioglass coating. The bone marrow stromal cell (BMSC) proliferation test showed that both uncoated and coated zirconia scaffolds have good biocompatibility. The scanning electron microscope (SEM) micrographs and the compositional analysis graphs demonstrated that, after testing in simulated body fluid (SBF) for 7 days, apatite formation occurred on the coating surface. Thus, porous zirconia-based ceramics were modified with a bioactive coating of mesoporous bioglass for potential biomedical applications.