955 results for Quantified Autoepistemic Logic


Relevance:

10.00%

Publisher:

Abstract:

This chapter explores the development of concepts of interactive environments by comparing two major projects that frame the period of this book. The Fun Palace of 1960 and the Generator of 1980 both proposed interactive environments responsive to the needs and behaviour of their users, but the contrast in terms of the available technology and what it enabled could not be more marked. The Fun Palace broke new architectural, organizational and social ground and was arguably the first proposition for cybernetic architecture; the Generator demonstrated how it could be achieved. Both projects are now acknowledged as seminal architectural propositions of the twentieth century, and both were designed by Cedric Price.

Relevance:

10.00%

Publisher:

Abstract:

To address the difficulty of propagating and synthesizing information from conceptual to embodiment design, this paper introduces a function-oriented, axiom-based conceptual modeling scheme. Default logic reasoning is exploited for the recognition and reconstitution of conceptual product geometric and topological information. The proposed product modeling system and reasoning approach embody a methodology of "structural variation design", which is verified in the implementation of a GPAL (Green Product All Life-cycle) CAD system. The GPAL system includes major enhancement modules: a mechanism layout sketching method based on fuzzy logic, a knowledge-based function-to-form mapping mechanism, and a conceptual form reconstitution paradigm based on default geometric reasoning. A mechanical hand design example shows a more than 20-fold increase in design efficacy with these enhancement modules in the GPAL system on a general 3D CAD platform.
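
The abstract names default (non-monotonic) reasoning as the mechanism for reconstituting missing geometric detail, but gives no implementation detail. The sketch below is a minimal, hypothetical illustration of how a default geometric rule might be applied; the rules, feature attributes and assumed values are inventions for illustration, not the GPAL system's actual knowledge base.

```python
# Minimal sketch of default-style geometric reasoning (hypothetical rules,
# not the GPAL system's actual knowledge base). A default rule fires when its
# prerequisite holds and its justification is consistent with what is already
# known about the feature.

def consistent(feature, attribute, value):
    """A justification is consistent if the attribute is unset or already equal."""
    return feature.get(attribute) in (None, value)

# Each rule: (prerequisite, (attribute, assumed_value))
DEFAULT_RULES = [
    # "A shaft whose cross-section is unspecified may be assumed circular."
    (lambda f: f.get("kind") == "shaft", ("cross_section", "circular")),
    # "A mating hole with no stated fit may be assumed a clearance fit."
    (lambda f: f.get("kind") == "hole" and f.get("mates_with"), ("fit", "clearance")),
]

def reconstitute(feature):
    """Fill in missing conceptual geometry using default rules."""
    for prerequisite, (attr, value) in DEFAULT_RULES:
        if prerequisite(feature) and consistent(feature, attr, value):
            feature.setdefault(attr, value)
    return feature

if __name__ == "__main__":
    sketchy_feature = {"kind": "shaft", "length_mm": 40}   # cross-section unspecified
    print(reconstitute(sketchy_feature))
    # -> {'kind': 'shaft', 'length_mm': 40, 'cross_section': 'circular'}
```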

Relevance:

10.00%

Publisher:

Abstract:

Providing support for reversible transformations as a basis for round-trip engineering is a significant challenge in model transformation research. While there are a number of current approaches, they require the underlying transformation to exhibit injective behaviour when reversing changes. This, however, does not serve all practical transformations well. In this paper, we present a novel approach to round-trip engineering that does not place restrictions on the nature of the underlying transformation. Based on abductive logic programming, it allows us to compute a set of legitimate source changes that equate to a given change to the target model. Encouraging results are derived from an initial prototype that supports most concepts of the Tefkat transformation language.
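
The abstract does not spell out how the abduction step works. The following is a deliberately simplified generate-and-test sketch of the same idea, not abductive logic programming and not the prototype's actual algorithm: candidate source edits are enumerated and kept only if re-running the forward transformation reproduces the desired target. The toy transformation, model shapes and candidate edits are hypothetical.

```python
# Generate-and-test sketch of "compute source changes that explain a target change".
# The transformation, model shapes and candidate edits are hypothetical;
# a real abductive-logic-programming engine derives candidates symbolically instead.

def transform(source: dict) -> dict:
    """Toy forward transformation: class name -> table name."""
    return {"table": source["class"].lower() + "s"}

def candidate_source_edits(source: dict):
    """Enumerate a (toy) space of possible source-side changes."""
    for new_name in ["Person", "Employee", "Customer"]:
        yield {**source, "class": new_name}

def abduce_source_changes(source: dict, desired_target: dict):
    """Keep every candidate source edit whose image equals the desired target."""
    return [s for s in candidate_source_edits(source)
            if transform(s) == desired_target]

if __name__ == "__main__":
    source = {"class": "Person"}
    desired_target = {"table": "employees"}      # someone edited the target model
    print(abduce_source_changes(source, desired_target))
    # -> [{'class': 'Employee'}]
```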

Relevance:

10.00%

Publisher:

Abstract:

Introduction: Nursing clinicians are primarily responsible for the monitoring and treatment of increased body temperature. The body temperature of patients during their acute care hospital stay is measured at regular, repeated intervals. In the event that a patient is assessed as having an elevated temperature, a multitude of decisions is required. The action of instigating temperature-reducing strategies is based upon the assumption that elevated temperature is harmful and that the strategy employed will have some beneficial effect. Background and Significance: The potentially harmful effects of increased body temperature (fever, hyperthermia) following neurological insult are well recognised. Although few studies have investigated this phenomenon in the diagnostic population of non-traumatic subarachnoid haemorrhage, it has been demonstrated that increased body temperature occurs in 41 to 72% of patients with poor clinical outcome. However, in the Australian context the frequency and other characteristics of increased body temperature, as well as the association between increased body temperature and poor clinical outcome, have not been established. Design: This study used a correlational design to: describe the frequency, duration and timing of increased body temperature; determine the association between increased body temperature and clinical outcome; and describe the clinical interventions used to manage increased body temperature in patients with non-traumatic subarachnoid haemorrhage. A retrospective clinical chart audit was conducted on 43 patients who met the inclusion criteria. Findings: The major findings were that increased body temperature occurred frequently, persisted for a long time, and did not onset until 20 hours after the primary insult; that increased body temperature was associated with death or dependent outcome; and that no intervention was recorded in many instances. Conclusion: This study has quantified the characteristics of increased body temperature in a non-traumatic subarachnoid haemorrhage patient population, established an association between increased body temperature and death or dependent outcome, and described the current management of elevated temperatures in the Australian context, in order to improve nursing practice, education and research.

Relevance:

10.00%

Publisher:

Abstract:

Risks and uncertainties are inevitable in engineering projects and infrastructure investments. Decisions about investment in infrastructure such as maintenance, rehabilitation and construction works can pose risks, and may generate significant impacts on social, cultural, environmental and other related issues. This report presents the results of a literature review of current practice in identifying, quantifying and managing risks and predicting impacts as part of the planning and assessment process for infrastructure investment proposals. In assessing proposals for investment in infrastructure, it is necessary to consider social, cultural and environmental risks and impacts to the overall community, as well as financial risks to the investor. The report defines and explains the concepts of risk and uncertainty, and describes the three main methodological approaches to the analysis of risk and uncertainty in investment planning for infrastructure, viz. examining a range of scenarios or options, sensitivity analysis, and a statistical probability approach, listed here in order of increasing merit and complexity. Forecasts of costs, benefits and community impacts of infrastructure are recognised as central aspects of developing and assessing investment proposals. Increasingly complex modelling techniques are being used for investment evaluation. The literature review identified forecasting errors as the major cause of risk. The report contains a summary of the broad nature of decision-making tools used by governments and other organisations in Australia, New Zealand, Europe and North America, and shows their overall approach to risk assessment in assessing public infrastructure proposals. While there are established techniques to quantify financial and economic risks, quantification is far less developed for political, social and environmental risks and impacts. For risks that cannot be readily quantified, assessment techniques commonly include classification or rating systems for likelihood and consequence. The report outlines the system used by the Australian Defence Organisation and in the Australian Standard on risk management. After each risk is identified and quantified or rated, consideration can be given to reducing the risk, and managing any remaining risk as part of the scope of the project. The literature review identified the use of risk mapping techniques by a North American chemical company and by the Australian Defence Organisation. This literature review has enabled a risk assessment strategy to be developed, and will underpin an examination of the feasibility of developing a risk assessment capability using a probability approach.
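
The review notes that non-quantifiable risks are commonly handled with likelihood and consequence rating scales. As a concrete illustration, the sketch below implements a generic qualitative risk matrix in the spirit of such schemes; the scale labels and the rating assigned to each combination are illustrative assumptions, not the Australian Defence Organisation's or the Australian Standard's actual matrix.

```python
# Generic likelihood x consequence rating matrix (illustrative values only;
# not the Australian Standard's or Defence's published matrix).

LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "almost certain"]
CONSEQUENCE = ["insignificant", "minor", "moderate", "major", "catastrophic"]

def rate_risk(likelihood: str, consequence: str) -> str:
    """Combine the two ordinal scales into a qualitative risk rating."""
    score = LIKELIHOOD.index(likelihood) + CONSEQUENCE.index(consequence)
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    if score <= 6:
        return "high"
    return "extreme"

if __name__ == "__main__":
    # e.g. a social impact judged "possible" with "major" consequence
    print(rate_risk("possible", "major"))   # -> "high"
```

After each risk is rated this way, mitigation effort can be prioritised from the qualitative rating rather than from a numerical probability that cannot be defended.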

Relevance:

10.00%

Publisher:

Abstract:

Entrepreneurial marketing has gained popularity in both the entrepreneurship and marketing disciplines in recent times. The success of ventures that have pursued what are considered non-traditional marketing approaches has been attributed to entrepreneurial marketing practices. Despite the multitude of marketing concepts and models, there are prominent venture successes that do not conform to these and have thus been put in the "entrepreneurial" box. One only has to look to the "Virgin" model to put this in context. Branson has proven, for example through the ways the Virgin portfolio has been diversified, that not "sticking to the knitting" can work. Consequently, an entrepreneurial orientation is considered a desirable philosophy and has become prominent in such industries as airlines and information technology. Miles and Arnold (1991) found that entrepreneurial orientation is positively correlated with marketing orientation. They propose that entrepreneurial orientation is a strategic response by firms to turbulence in the environment. While many marketing successes are analysed in hindsight using traditional marketing concepts and strategies, there are those that challenge standard marketing textbook recommendations. Marketing strategy is often viewed as a process of segmenting, targeting and positioning (STP). Academics and consultants advocate this approach along with the marketing and business plans. The reality, however, is that a number of businesses do not practice these and pursue alternative approaches. Other schools of thought and business models have been developed to explain differences in orientation, such as branding (Keller 2001), the service-dominant logic (Vargo and Lusch 2004) and effectuation logic (Sarasvathy 2001). This indicates that scholars are now looking to cognate fields to explain a given phenomenon beyond their own disciplines. Bucking this trend is a growing number of researchers working at the interface between entrepreneurship and marketing. There is now an emerging body of work dedicated to this interface, hence the development of entrepreneurial marketing as an alternative to the traditional approaches. Hills and Hultman (2008:3) define entrepreneurial marketing as "a spirit, an orientation as well as a process of passionately pursuing opportunities and launching and growing ventures that create perceived customer value through relationships by employing innovativeness, creativity, selling, market immersion, networking and flexibility." Although it started as a special interest group, entrepreneurial marketing is now gaining recognition in mainstream entrepreneurship and marketing literature. For example, new marketing textbooks now incorporate an entrepreneurial marketing focus (Grewal and Levy 2008). The purpose of this paper is to explore what entrepreneurial approaches are used by entrepreneurs and their impact on the success of marketing activities. Methodology/Key Propositions: In order to investigate this, we employ two cases: 42Below, vodka producers from New Zealand, and Penderyn Distillery, whisky distillers from Wales. The cases were chosen based on the following criteria. Firstly, both companies originate from small economies. Secondly, both make products (spirits) in locations that are not traditionally regarded as producers of their flagship products, and thirdly, the two companies differ in age. Penderyn is an old company established in 1882, whereas 42Below was founded only in 1999.
Vodka has never been associated with New Zealand. By the same token, whisky has always been associated with Scotland and Ireland but never with Wales. Both companies defied traditional stereotypes in marketing their flagship products and found international success. Using a comparative case study approach, we take Covin and Slevin's (1989) set of items that purport to measure entrepreneurial orientation and apply a qualitative lens to the approaches of both companies. These are: 1. cultural emphases on innovation and R&D; 2. high rate of new product introduction; 3. bold, innovative product development; 4. initiator, proactive posture; 5. first to introduce new technologies and products; 6. competitive posture toward competitors; 7. strong proclivity for high-risk, high-return projects; 8. environment requires boldness to achieve objectives; 9. when faced with risk, adopts an aggressive, bold posture. Results and Implications: We find that both companies have employed entrepreneurial marketing approaches, but with different intensities. While both acknowledge that they are different from the norm, the specifics of their individual approaches are dissimilar. Both companies have positioned their products at the premium end of their product categories and have emphasised quality and awards in their communication strategies. 42Below has carved an image of irreverence and of being non-conformist. They have unashamedly utilised viral marketing and entered international markets by training bartenders and hosting unconventional events. They use edgy language such as vodka university, vodka professors and vodka ambassadors. Penderyn Distillery has taken a more traditional approach to marketing its products, with romantic images of the age-old tradition of distilling as key to its positioning. Both companies enjoy success as evidenced by industry awards and international acclaim.

Relevance:

10.00%

Publisher:

Abstract:

Reliable budget/cost estimates for road maintenance and rehabilitation are subject to uncertainties and variability in road asset condition and in the characteristics of road users. The CRC CI research project 2003-029-C 'Maintenance Cost Prediction for Road' developed a method for assessing variation and reliability in budget/cost estimates for road maintenance and rehabilitation. The method is based on probability-based reliability theory and statistical methods. The next stage of the current project is to apply the developed method to predict maintenance/rehabilitation budgets/costs of large networks for strategic investment. The first task is to assess the variability of road data. This report presents initial results of the analysis in assessing the variability of road data. A case study of the analysis for dry, non-reactive soil is presented to demonstrate the concept of analysing the variability of road data for large road networks. In assessing the variability of road data, large road networks were divided into categories with common characteristics according to soil and climatic conditions, pavement conditions, pavement types, surface types and annual average daily traffic. The probability distributions, statistical means and standard deviations of asset conditions and annual average daily traffic for each category were quantified. The probability distributions and the statistical information obtained in this analysis will be used to assess the variation and reliability in budget/cost estimates at a later stage. Typically, mean values of the asset data in each category are used as input values for investment analysis, and the variability of the asset data within each category is not taken into account. This analysis demonstrated that the method can be used in practical application, taking into account the variability of road data, when analysing large road networks for maintenance/rehabilitation investment analysis.
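
The report's first step, quantifying the mean and spread of condition data within each road category, can be illustrated with a short sketch. The category keys, condition measure and data values below are hypothetical placeholders, not figures from the project.

```python
# Sketch: per-category summary statistics for road condition data
# (category labels and roughness values are made-up placeholders).
from collections import defaultdict
from statistics import mean, stdev

# Each record: (category key, observed condition value, e.g. roughness in IRI m/km)
records = [
    (("dry non-reactive soil", "sprayed seal"), 2.1),
    (("dry non-reactive soil", "sprayed seal"), 2.6),
    (("dry non-reactive soil", "sprayed seal"), 3.0),
    (("wet reactive soil", "asphalt"), 3.8),
    (("wet reactive soil", "asphalt"), 4.4),
]

by_category = defaultdict(list)
for category, value in records:
    by_category[category].append(value)

for category, values in by_category.items():
    spread = stdev(values) if len(values) > 1 else 0.0
    print(category, "mean =", round(mean(values), 2), "std =", round(spread, 2))
```

Feeding the per-category distributions, rather than only the means, into the investment analysis is what allows the later reliability assessment of the budget/cost estimates.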

Relevance:

10.00%

Publisher:

Abstract:

Motor vehicles are major emitters of gaseous and particulate pollution in urban areas, and exposure to particulate pollution can have serious health effects, ranging from respiratory and cardiovascular disease to mortality. Motor vehicle tailpipe particle emissions span a broad size range, from 0.003 to 10 µm, and are measured as different subsets of particle mass concentrations or particle number count. However, no comprehensive inventories currently exist in the international published literature covering this wide size range. This paper presents the first published comprehensive inventory of motor vehicle tailpipe particle emissions covering the full size range of particles emitted. The inventory was developed for urban South-East Queensland by combining two techniques from distinctly different disciplines: aerosol science and transport modelling. A comprehensive set of particle emission factors was combined with traffic modelling, and tailpipe particle emissions were quantified for particle number (ultrafine particles), PM1, PM2.5 and PM10 for light and heavy duty vehicles and buses. A second aim of the paper was to use the data derived in this inventory for scenario analyses, to model the particle emission implications of different proportions of passengers travelling in light duty vehicles and buses in the study region, and to derive an estimate of fleet particle emissions in 2026. It was found that heavy duty vehicles (HDVs) in the study region were major emitters of particulate matter pollution: although they contributed only around 6% of total regional vehicle kilometres travelled, they contributed more than 50% of the region’s particle number (ultrafine particle) and PM1 emissions. With the freight task in the region predicted to double over the next 20 years, this suggests that HDVs need to be a major focus of mitigation efforts. HDVs dominated particle number (ultrafine particle) and PM1 emissions, while light duty vehicles (LDVs) dominated PM2.5 and PM10 emissions. Buses contributed approximately 1-2% of regional particle emissions.
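
The inventory method itself is simple to state: an emission factor per vehicle class and size fraction is multiplied by that class's travel activity and summed. The sketch below shows only this arithmetic skeleton; the emission factors, vehicle-kilometre figures and class names are invented placeholders, not the study's values.

```python
# Arithmetic skeleton of a tailpipe particle emission inventory:
# total emissions = emission factor (per km) x vehicle kilometres travelled,
# summed per vehicle class. All numbers below are placeholders.

EMISSION_FACTORS = {                 # grams of PM1 per vehicle-kilometre (illustrative)
    "light duty vehicle": 0.005,
    "heavy duty vehicle": 0.120,
    "bus":                0.080,
}
VKT = {                              # annual vehicle kilometres travelled (illustrative)
    "light duty vehicle": 20_000_000_000,
    "heavy duty vehicle": 1_300_000_000,
    "bus":                250_000_000,
}

emissions = {cls: EMISSION_FACTORS[cls] * VKT[cls] for cls in EMISSION_FACTORS}
total = sum(emissions.values())

for cls, grams in emissions.items():
    print(f"{cls}: {grams / 1e6:.1f} t, share {100 * grams / total:.1f}%")
```

Repeating this calculation per pollutant (particle number, PM1, PM2.5, PM10) and per forecast year is what allows the scenario analyses described in the abstract.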

Relevance:

10.00%

Publisher:

Abstract:

There is currently a strong focus worldwide on the potential of large-scale Electronic Health Record (EHR) systems to cut costs and improve patient outcomes through increased efficiency. This is accomplished by aggregating medical data from isolated Electronic Medical Record databases maintained by different healthcare providers. Concerns about the privacy and reliability of Electronic Health Records are crucial to healthcare service consumers. Traditional security mechanisms are designed to satisfy confidentiality, integrity, and availability requirements, but they fail to provide a measurement tool for data reliability from a data entry perspective. In this paper, we introduce a Medical Data Reliability Assessment (MDRA) service model to assess the reliability of medical data by evaluating the trustworthiness of its sources, usually the healthcare provider which created the data and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record. The result is then expressed by manipulating health record metadata to alert medical practitioners relying on the information to possible reliability problems.
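
The abstract describes the MDRA model only at the level of "assess reliability by evaluating the trustworthiness of the data's sources and record the result in the record's metadata". The sketch below is a hypothetical reading of that idea: the weighted-average scoring, the weights and the metadata field names are assumptions, not the model published in the paper.

```python
# Hypothetical sketch of source-based reliability scoring in the spirit of MDRA.
# Weights, thresholds and field names are assumptions, not the published model.
from dataclasses import dataclass, field

@dataclass
class HealthRecordEntry:
    content: str
    provider_trust: float       # trustworthiness of the healthcare provider, 0..1
    practitioner_trust: float   # trustworthiness of the diagnosing practitioner, 0..1
    metadata: dict = field(default_factory=dict)

def assess_reliability(entry: HealthRecordEntry,
                       provider_weight: float = 0.4,
                       practitioner_weight: float = 0.6) -> HealthRecordEntry:
    """Attach a reliability score and an alert flag to the entry's metadata."""
    score = (provider_weight * entry.provider_trust
             + practitioner_weight * entry.practitioner_trust)
    entry.metadata["reliability_score"] = round(score, 2)
    entry.metadata["reliability_alert"] = score < 0.5   # flag low-confidence data
    return entry

if __name__ == "__main__":
    entry = HealthRecordEntry("penicillin allergy", provider_trust=0.9,
                              practitioner_trust=0.3)
    print(assess_reliability(entry).metadata)
    # -> {'reliability_score': 0.54, 'reliability_alert': False}
```

Whatever the actual scoring scheme, the key design point in the abstract is that the result lives in the record's metadata, so practitioners reading the record are alerted to possible reliability problems without the clinical content itself being altered.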

Relevance:

10.00%

Publisher:

Abstract:

Technical report to accompany "Ownership for Reasoning About Parallelism". It documents the type system, which captures effects, and the operational semantics of the language presented as part of the paper.

Relevance:

10.00%

Publisher:

Abstract:

The fracture healing process is modulated by the mechanical environment created by imposed loads and motion between the bone fragments. Contact between the fragments obviously results in a significantly different stress and strain environment to a uniform fracture gap containing only soft tissue (e.g. haematoma). The assumption of the latter in existing computational models of the healing process will hence exaggerate the inter-fragmentary strain in many clinically-relevant cases. To address this issue, we introduce the concept of a contact zone that represents a variable degree of contact between cortices by the relative proportions of bone and soft tissue present. This is introduced as an initial condition in a two-dimensional iterative finite element model of a healing tibial fracture, in which material properties are defined by the volume fractions of each tissue present. The algorithm governing the formation of cartilage and bone in the fracture callus uses fuzzy logic rules based on strain energy density resulting from axial compression. The model predicts that increasing the degree of initial bone contact reduces the amount of callus formed (periosteal callus thickness 3.1mm without contact, down to 0.5mm with 10% bone in contact zone). This is consistent with the greater effective stiffness in the contact zone and hence, a smaller inter-fragmentary strain. These results demonstrate that the contact zone strategy reasonably simulates the differences in the healing sequence resulting from the closeness of reduction.
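
The abstract states that material properties in the contact zone are defined by the volume fractions of the tissues present. One conventional way to realise that statement is a linear rule of mixtures over the tissue moduli, shown below as a sketch; the modulus values and the assumption of a linear mixture are illustrative, not taken from the paper.

```python
# Sketch: volume-fraction-weighted (linear rule of mixtures) stiffness for a
# contact-zone element. Moduli are illustrative order-of-magnitude values,
# and a linear mixture is only one possible homogenisation choice.

TISSUE_MODULUS_MPA = {
    "cortical_bone": 20_000.0,   # illustrative
    "cartilage": 10.0,           # illustrative
    "soft_tissue": 3.0,          # haematoma / granulation tissue, illustrative
}

def effective_modulus(volume_fractions: dict) -> float:
    """Linear rule of mixtures: E_eff = sum(f_i * E_i); fractions must sum to 1."""
    assert abs(sum(volume_fractions.values()) - 1.0) < 1e-9
    return sum(f * TISSUE_MODULUS_MPA[t] for t, f in volume_fractions.items())

if __name__ == "__main__":
    no_contact = {"cortical_bone": 0.0, "cartilage": 0.0, "soft_tissue": 1.0}
    ten_pct_contact = {"cortical_bone": 0.1, "cartilage": 0.0, "soft_tissue": 0.9}
    print(effective_modulus(no_contact))       # -> 3.0 MPa
    print(effective_modulus(ten_pct_contact))  # -> 2002.7 MPa
```

The large jump in effective stiffness with even 10% bone in the contact zone is consistent with the abstract's observation that modest initial contact sharply reduces inter-fragmentary strain and, with it, the amount of callus the model predicts.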

Relevance:

10.00%

Publisher:

Abstract:

Insufficient availability of osteogenic cells limits bone regeneration through cell-based therapies. This study investigated the potential of amniotic fluid–derived stem (AFS) cells to synthesize mineralized extracellular matrix within porous medical-grade poly-ε-caprolactone (mPCL) scaffolds. The AFS cells were initially differentiated in two-dimensional (2D) culture to determine appropriate osteogenic culture conditions and verify physiologic mineral production by the AFS cells. The AFS cells were then cultured on 3D mPCL scaffolds (6-mm diameter × 9-mm height) and analyzed for their ability to differentiate to osteoblastic cells in this environment. The amount and distribution of mineralized matrix production were quantified throughout the mPCL scaffold using nondestructive micro-computed tomography (microCT) analysis and confirmed through biochemical assays. Sterile microCT scanning provided longitudinal analysis of long-term cultured mPCL constructs to determine the rate and distribution of mineral matrix within the scaffolds. The AFS cells deposited mineralized matrix throughout the mPCL scaffolds and remained viable after 15 weeks of 3D culture. The effect of predifferentiation of the AFS cells on subsequent bone formation in vivo was determined in a rat subcutaneous model. Cells that were predifferentiated for 28 days in vitro produced seven times more mineralized matrix when implanted subcutaneously in vivo. This study demonstrated the potential of AFS cells to produce 3D mineralized bioengineered constructs in vitro and in vivo, and suggests that AFS cells may be an effective cell source for functional repair of large bone defects.

Relevance:

10.00%

Publisher:

Abstract:

Dr. Young-Ki Paik directs the Yonsei Proteome Research Center in Seoul, Korea and was elected as the President of the Human Proteome Organization (HUPO) in 2009. In the December 2009 issue of Current Pharmacogenomics and Personalized Medicine (CPPM), Dr. Paik explains the new field of pharmacoproteomics and the approaching wave of “proteomics diagnostics” in relation to personalized medicine, HUPO’s role in advancing proteomics technology applications, the HUPO Proteomics Standards Initiative, and the future impact of proteomics on medicine, science, and society. Additionally, he comments that (1) there is a need for launching a Gene-Centric Human Proteome Project (GCHPP) through which all representative proteins encoded by the genes can be identified and quantified in a specific cell and tissue, and (2) the innovation frameworks within the diagnostics industry hitherto borrowed from the genetics age may require reevaluation in the case of proteomics, in order to facilitate the uptake of pharmacoproteomics innovations. He stresses the importance of biological/clinical plausibility driving the evolution of biotechnologies such as proteomics, instead of an isolated singular focus on the technology per se. Dr. Paik earned his Ph.D. in biochemistry from the University of Missouri-Columbia and carried out postdoctoral work at the Gladstone Foundation Laboratories of Cardiovascular Disease, University of California at San Francisco. In 2005, his research team at Yonsei University first identified and characterized the chemical structure of the C. elegans dauer pheromone (daumone), which controls the aging process of this nematode. He is interviewed by a multidisciplinary team specializing in knowledge translation, technology regulation, health systems governance, and innovation analysis.

Relevance:

10.00%

Publisher:

Abstract:

The aim of the dissertation is to discover the extent to which methodologies and conceptual frameworks used to understand popular culture may also be useful in the attempt to understand contemporary high culture. The dissertation addresses this question through the application of subculture theory to Brisbane’s contemporary chamber music scene, drawing on a detailed case study of the contemporary chamber ensemble Topology and its audiences. The dissertation begins by establishing the logic and necessity of applying cultural studies methodologies to contemporary high culture. This argument is supported by a discussion of the conceptual relationships between cultural studies, high culture, and popular culture, and the methodological consequences of these relationships. In Chapter 2, a brief overview of interdisciplinary approaches to music reveals the central importance of subculture theory, and a detailed survey of the history of cultural studies research into music subcultures follows. Five investigative themes are identified as being crucial to all forms of contemporary subculture theory: the symbolic; the spatial; the social; the temporal; the ideological and political. Chapters 3 and 4 present the findings of the case study as they relate to these five investigative themes of contemporary subculture theory. Chapter 5 synthesises the findings of the previous two chapters, and argues that while participation in contemporary chamber music is not as intense or pervasive as is the case with the most researched street-based youth subcultures, it is nevertheless possible to describe Brisbane’s contemporary chamber music scene as a subculture. The dissertation closes by reflecting on the ways in which the subcultural analysis of contemporary chamber music has yielded some insight into the lived practices of high culture in contemporary urban contexts.

Relevance:

10.00%

Publisher:

Abstract:

This paper is part one of a three-part study into the collective regulation processes of players in massive multiplayer online games (MMOGs). Traditionally, game playing has not been classed as problematic; however, with the introduction of new media technologies and new ways to play games, certain contexts have become obscure, namely the localised order of ‘playing online’, or how players manage and maintain order between each other as opposed to ‘following the rules’. Principally, this paper examines the concept of ‘virtual community’. This concept will be shown to be particularly unhelpful when considering how people conduct themselves in these spaces. ‘Virtual community’ will be seen as critical in implicating some online behaviours as superior to others, thereby obscuring and blurring actions. This obscurity is grounded in strong associations within the virtual community as a logic of practice in and of itself; behaviours that fall outside this category become common sense and as such are made invisible to investigation. This paper draws upon the theories of Basil Bernstein and Pierre Bourdieu to produce a distinction between online behaviours and ultimately make them visible for further investigation. In doing so, this paper seeks to form a basis for future research in which interaction in these spaces can be identified as belonging to a certain framework, to inform the design of online games and applications more effectively.