833 results for INTERNATIONAL CARIES DETECTION AND ASSESSMENT SYSTEM


Relevance:

100.00%

Abstract:

This thesis describes the development of a complete data visualisation system for large tabular databases, such as those commonly found in a business environment. A state-of-the-art 'cyberspace cell' data visualisation technique was investigated and a powerful visualisation system using it was implemented. Although it allowed databases to be explored and conclusions drawn, it had several drawbacks, most of which were due to the three-dimensional nature of the visualisation. A novel two-dimensional generic visualisation system, known as MADEN, was then developed and implemented, based upon a 2-D matrix of 'density plots'. MADEN allows an entire high-dimensional database to be visualised in one window, while permitting close analysis in 'enlargement' windows. Selections of records can be made and examined, and dependencies between fields can be investigated in detail. MADEN was used as a tool for investigating and assessing many data processing algorithms, first data-reducing (clustering) methods, then dimensionality-reducing techniques. These included a new 'directed' form of principal components analysis, several novel applications of artificial neural networks, and discriminant analysis techniques which illustrated how groups within a database can be separated. To illustrate the power of the system, MADEN was used to explore customer databases from two financial institutions, resulting in a number of discoveries which would be of interest to a marketing manager. Finally, the database of results from the 1992 UK Research Assessment Exercise was analysed. Using MADEN allowed both universities and disciplines to be graphically compared, and supplied some startling revelations, including empirical evidence of the 'Oxbridge factor'.
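MADEN itself is described only in prose here; purely as an illustration of the central idea of a matrix of density plots, the sketch below renders a pairwise 2-D histogram grid for a small tabular dataset. The function name, field names, and all parameters are hypothetical and are not taken from MADEN.

```python
# Hypothetical sketch of a MADEN-style display: every pair of fields in a
# tabular database is rendered as a 2-D histogram ("density plot"), so an
# entire high-dimensional dataset can be inspected in one window.
import numpy as np
import matplotlib.pyplot as plt

def density_plot_matrix(data, field_names, bins=32):
    """data: (n_records, n_fields) array of numeric field values."""
    n = data.shape[1]
    fig, axes = plt.subplots(n, n, figsize=(2 * n, 2 * n))
    for i in range(n):
        for j in range(n):
            ax = axes[i, j]
            if i == j:
                ax.hist(data[:, i], bins=bins)                # 1-D density on the diagonal
            else:
                ax.hist2d(data[:, j], data[:, i], bins=bins)  # pairwise density plot
            ax.set_xticks([]); ax.set_yticks([])
            if i == n - 1:
                ax.set_xlabel(field_names[j], fontsize=8)
            if j == 0:
                ax.set_ylabel(field_names[i], fontsize=8)
    return fig

# Synthetic stand-in for a "customer database" with correlated fields.
rng = np.random.default_rng(0)
demo = rng.normal(size=(1000, 4)) @ rng.normal(size=(4, 4))
density_plot_matrix(demo, ["age", "income", "balance", "tenure"])
plt.show()
```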

Relevance:

100.00%

Abstract:

The national systems of innovation (NIS) approach focuses on the patterns and determinants of innovation processes from the perspective of nation-states. This paper reports on continuing work on the application of an NIS model to the development of technological capability in Turkey. An initial assessment of the literature shows that there are a number of alternative conceptualisations of NIS. An attempt by the Turkish government to identify an NIS for Turkey shows the main actors in the system but does not pay sufficient attention to the processes of interaction between agents within the system. An operational model should be capable of representing these processes and interactions and of assessing the strengths and weaknesses of the NIS. For industrialising countries, it is also necessary to incorporate learning mechanisms into the model. Further, there are different levels of innovation and capability in different sectors, which the national perspective may not reflect. This paper is arranged in three sections. The first briefly explains the basics of the national innovation and learning system. Although there is no single accepted definition of NIS, the alternative definitions reviewed share some common characteristics. In the second section, an NIS model is applied to Turkey in order to identify the elements that characterise the country's NIS. This section explains knowledge flows and defines the relations between the actors within the system. The final section draws on the "from imitation to innovation" model apparently so successful in East Asia and assesses its applicability to Turkey. In assessing Turkey's NIS, the focus is on the automotive and textile sectors.

Relevance:

100.00%

Abstract:

This study was carried out with new lecturers on a two-year Postgraduate Certificate in Learning and Teaching in Higher Education programme at a UK university. The aim was to establish their beliefs about how studying on the programme aligned with their teaching and learning philosophy and what, if anything, had changed or constrained those beliefs. Ten lecturers took part in in-depth semi-structured interviews. Content analysis of the transcripts suggested positive reactions to the programme, but lecturers' new insights were sometimes constrained by departments and university bureaucracy, particularly in the area of assessment. The conflicting roles of research and teaching were also a major issue facing these new professionals.

Relevance:

100.00%

Abstract:

The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high-performance, flexible architectures that allow for quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy, but with trade-offs among processing performance (achieving specified frame rates on large image data sets), power, and cost constraints. New architectures are needed to keep pace with the fast innovation in video and imaging. This dissertation therefore presents dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance. The following outlines the contributions of the dissertation. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrences during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analyzed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum-distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. The gesture sets involved in different applications may vary; it is therefore essential to keep the feature vector as small as possible while maintaining accuracy and performance.
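The abstract names the RAMT approach without giving its details; the following numpy sketch is a plausible software analogue of running-average background subtraction with a globally adapted threshold. The class, the parameter values, and the specific threshold rule (a multiple of the mean frame difference) are assumptions, not the dissertation's actual design.

```python
# Software analogue of running-average background subtraction with a globally
# adapted threshold -- a guess at the spirit of RAMT, not the dissertation's
# actual algorithm. Parameter values are invented.
import numpy as np

class RunningAverageDetector:
    def __init__(self, frame_shape, alpha=0.05, k=2.5):
        self.background = np.zeros(frame_shape, dtype=np.float32)  # running mean image
        self.alpha = alpha  # background learning rate (assumed)
        self.k = k          # scale factor for the global threshold (assumed)

    def detect(self, frame):
        """Return a boolean foreground mask for one greyscale frame."""
        frame = frame.astype(np.float32)
        diff = np.abs(frame - self.background)
        # A single global threshold derived from the whole frame's mean
        # difference, so the detector adapts to indoor/outdoor lighting.
        threshold = self.k * diff.mean()
        mask = diff > threshold
        # Update the running average only where no target was detected,
        # to avoid absorbing foreground objects into the background.
        still = ~mask
        self.background[still] = ((1 - self.alpha) * self.background[still]
                                  + self.alpha * frame[still])
        return mask
```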

Relevance:

100.00%

Abstract:

In his dialogue, "Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors," William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: "The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools." At the time of this writing, Associate Professor O'Brien will have you know, contrary to what some people might think, the computer revolution is not over; it's just beginning; it's just an embryo. Computer technology will only continue to develop and expand, says O'Brien with citation. "A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry," says Professor O'Brien. "Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation," is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O'Brien says. "The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology," he advises. On the vendor side of the equation, O'Brien observes, "Computer-wise hospitality managers want systems which are easier and more profitable to operate. Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…" he says. O'Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has genuinely morphed into the software revolution, informs O'Brien, "…recognizing that a computer is nothing but a box in which programs run." Yes, some of the empirical data in this article is dated by now, but the core philosophy of advancing technology, and of properties continually tapping current knowledge, is sound.

Relevance:

100.00%

Abstract:

In the current managed Everglades system, the pre-drainage, patterned mosaic of sawgrass ridges, sloughs, and tree islands has been substantially altered or reduced, largely as a result of human alterations to the historic ecological and hydrological processes that sustained landscape patterns. The pre-compartmentalization ridge and slough landscape was a mosaic of sloughs, elongated sawgrass ridges (50–200 m wide), and tree islands. The ridges, sloughs, and tree islands were elongated in the direction of water flow, with roughly equal areas of ridge and slough. Over the past decades, the ridge-slough topographic relief and spatial patterning have degraded in many areas of the Everglades. Nutrient-enriched areas have become dominated by Typha with little topographic relief; areas of reduced flow have lost the elongated ridge-slough topography; and ponded areas with excessively long hydroperiods have experienced a decline in ridge prevalence and shape, and in the number of tree islands (Sklar et al. 2004, Ogden 2005).

Relevance:

100.00%

Abstract:

Since the 1950s the global consumption of natural resources has skyrocketed, both in magnitude and in the range of resources used. Closely coupled with emissions of greenhouse gases, land consumption, pollution of environmental media, and degradation of ecosystems, as well as with economic development, increasing resource use is a key issue to be addressed in order to keep planet Earth in a safe and just operating space. This requires thinking about absolute reductions in resource use and associated environmental impacts and, when put in the context of the current re-focusing on economic growth at the European level, absolute decoupling, i.e., maintaining economic development while absolutely reducing resource use and associated environmental impacts. Changing the behavioural, institutional and organisational structures that lock in unsustainable resource use is thus a formidable challenge, as existing world views, social practices, infrastructures, and power structures make initiating change difficult. Hence, policy mixes are needed that target different drivers in a systematic way. When designing policy mixes for decoupling, the effect of individual instruments on other drivers and on other instruments in a mix should be considered and potential negative effects mitigated. This requires smart and time-dynamic policy packaging. This Special Issue investigates the following research questions: What is decoupling and how does it relate to resource efficiency and environmental policy? How can we develop and realise policy mixes for decoupling economic development from resource use and associated environmental impacts? And how can we do this in a systemic way, so that all relevant dimensions and linkages—including across economic and social issues, such as production, consumption, transport, growth and wellbeing—are taken into account? In addressing these questions, the overarching goals of this Special Issue are to: address the challenges related to more sustainable resource use; contribute to the development of successful policy tools and practices for sustainable development and resource efficiency (particularly through the exploration of socio-economic, scientific, and integrated aspects of sustainable development); and inform policy debates and policy-making. The Special Issue draws on findings from the EU and other countries to offer lessons of international relevance for policy mixes for more sustainable resource use, with findings of interest to policy makers in central and local government and NGOs, decision makers in business, academics, researchers, and scientists.
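For concreteness, the distinction between relative and absolute decoupling drawn above can be reduced to a simple numeric test on two time series; the toy sketch below uses made-up figures purely to pin down the definition.

```python
# Toy illustration of the decoupling definitions used above (made-up numbers).
gdp          = [100, 103, 106, 110]   # economic output by year
resource_use = [ 80,  79,  77,  74]   # e.g. domestic material consumption

gdp_growth      = gdp[-1] / gdp[0] - 1                     # +10.0%
resource_growth = resource_use[-1] / resource_use[0] - 1   # -7.5%

# Relative decoupling: resource use grows more slowly than the economy.
# Absolute decoupling: the economy grows while resource use actually falls.
relative = resource_growth < gdp_growth
absolute = gdp_growth > 0 and resource_growth < 0
print(f"relative: {relative}, absolute: {absolute}")  # both True here
```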

Relevance:

100.00%

Abstract:

This paper presents a study that was undertaken to examine human interaction with a pedagogical agent and the passive and active detection of such agents within a synchronous, online environment. A pedagogical agent is a software application which can provide a human-like interaction using a natural language interface. These may be familiar from smartphone interfaces such as 'Siri' or 'Cortana', or from the virtual online assistants found on some websites, such as 'Anna' on the Ikea website. Pedagogical agents are characters on the computer screen with embodied life-like behaviours such as speech, emotions, locomotion, gestures, and movements of the head, the eyes, or other parts of the body. In the passive detection test, participants are not primed to the potential presence of a pedagogical agent within the online environment; in the active detection test, they are. The purpose of the study was to examine how people passively detected pedagogical agents that were presenting themselves as humans in an online environment. In order to locate the pedagogical agent in a realistic higher-education online environment, problem-based learning online was used. Problem-based learning online provides a focus for discussions and participation without creating too much artificiality. The findings indicated that the ways in which students positioned the agent tended to influence the interaction between them. One of the key findings was that, because the agent focussed mainly on the pedagogical task, interaction with the students may have been hampered; however, some of its non-task dialogue did improve students' perceptions of the agent's ability to interact with them. It is suggested that future studies explore the differences between the relationships and interactions of learner and pedagogical agent within authentic situations, in order to understand whether students' interactions differ between real and virtual mentors in an online setting.

Relevance:

100.00%

Abstract:

This paper reviews the literature on construction risk modelling and assessment, and also reviews the real-world practice of risk assessment. The review yielded significant findings, summarised as follows. There has been a major shift in risk perception from an estimation variance to a project attribute. Although the Probability–Impact risk model prevails, substantial effort is being put into improving it to reflect the increasing complexity of construction projects. The literature lacks a comprehensive assessment approach capable of capturing risk impact on different project objectives. Obtaining a realistic project risk level demands an effective mechanism for aggregating individual risk assessments. The various assessment tools suffer from low take-up; professionals typically rely on their experience. It is concluded that a simple analytical tool that uses risk cost as a common scale and utilises professional experience could be a viable option for closing the gap between the theory and practice of risk assessment.
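The paper's proposal of risk cost as a common scale can be pictured with a minimal aggregation sketch like the one below; the register entries and the expected-value rule (probability times impact cost) are illustrative assumptions, not the tool the paper proposes.

```python
# Minimal sketch of aggregating individual Probability-Impact assessments on a
# common "risk cost" scale. Entries and the expected-value rule are
# illustrative; this is not the tool proposed in the paper.
risk_register = [
    # (risk, probability, impact cost if it occurs)
    ("ground conditions worse than surveyed", 0.30, 250_000),
    ("late delivery of structural steel",     0.15, 120_000),
    ("design change requested by client",     0.40,  90_000),
]

# Expected risk cost of each item: probability x impact cost.
expected = {name: p * cost for name, p, cost in risk_register}

# A project-level risk figure on the common monetary scale.
project_risk_cost = sum(expected.values())
print(f"Project expected risk cost: {project_risk_cost:,.0f}")  # 129,000
```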

Relevance:

100.00%

Abstract:

The FIREDASS (FIRE Detection And Suppression Simulation) project is concerned with the development of fine water mist systems as a possible replacement for the halon fire suppression system currently used in aircraft cargo holds. The project is funded by the European Commission under the BRITE EURAM programme. The FIREDASS consortium is made up of a combination of industrial, academic, research, and regulatory partners. As part of this programme of work, a computational model has been developed to help engineers optimise the design of the water mist suppression system. This computational model is based on Computational Fluid Dynamics (CFD) and is composed of the following components: fire model, mist model, two-phase radiation model, suppression model, and detector/activation model. The fire model - developed by the University of Greenwich - uses prescribed release rates for heat and gaseous combustion products to represent the fire load. Typical release rates have been determined through experimentation conducted by SINTEF. The mist model - developed by the University of Greenwich - is a Lagrangian particle tracking procedure that is fully coupled to both the gas phase and the radiation field. The radiation model - developed by the National Technical University of Athens - is a six-flux radiation model. The suppression model - developed by SINTEF and the University of Greenwich - is based on an extinguishment criterion that relies on oxygen concentration and temperature. The detector/activation model - developed by Cerberus - allows many different detector and misting-nozzle configurations to be tested within the computational model. These sub-models have been integrated by the University of Greenwich into the FIREDASS software package. The model has been validated using data from the SINTEF/GEC test campaigns, and the computational model gives good agreement with these experimental results. The best agreement is obtained at the ceiling, which is where the detectors and misting nozzles would be located in a real system. In this paper the model is briefly described and some results from the validation of the fire and mist models are presented.
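The abstract names the inputs to the suppression model (oxygen concentration and temperature) but not its equations; purely as an illustration of what a cell-wise extinguishment criterion of this kind might look like, consider the sketch below. The threshold values and the rule for combining the two conditions are invented, not the FIREDASS criterion.

```python
# Illustrative cell-wise extinguishment criterion driven by oxygen
# concentration and temperature. Threshold values and the combination rule
# are invented; the actual FIREDASS criterion is not given in the abstract.
import numpy as np

O2_LIMIT_VOL_FRAC = 0.13   # assumed limiting oxygen volume fraction
T_SUSTAIN_K       = 600.0  # assumed minimum gas temperature to sustain flame

def extinguished(o2_field, temperature_field):
    """Boolean mask of CFD cells where the flame is deemed extinguished."""
    return (o2_field < O2_LIMIT_VOL_FRAC) | (temperature_field < T_SUSTAIN_K)

# Toy 2x2 grid of cells:
o2   = np.array([[0.20, 0.10], [0.12, 0.18]])
temp = np.array([[900.0, 900.0], [550.0, 700.0]])
print(extinguished(o2, temp))
# [[False  True]
#  [ True False]]
```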

Relevance:

100.00%

Abstract:

High-ranking Chinese military officials are often quoted in international media as stating that China cannot afford to lose even an inch of Chinese territory, as this territory has been passed down from Chinese ancestors. Such statements are not new in Chinese politics, but recently this narrative has made an important transition. While previously limited to disputes over land borders, such rhetoric is now routinely applied to disputes involving islands and maritime borders. China is increasingly oriented toward its maritime borders and seems unwilling to compromise on delimitation disputes, a transition mirrored by many states across the globe. In a similar vein, scholarship has found that territorial disputes are particularly intractable and volatile when compared with other types of disputes, and a large body of research has grappled with producing systematic knowledge of territorial conflict. Yet in this wide body of literature, an important question has remained largely unanswered: how do states determine which geographical areas will be included in their territorial and maritime claims? In other words, if nations are willing to fight and die for an inch of national territory, how do governments draw the boundaries of the nation? This dissertation uses in-depth case studies of some of the most prominent territorial and maritime disputes in East Asia to argue that domestic political processes play a dominant and previously under-explored role in both shaping claims and determining the nature of territorial and maritime disputes. China and Taiwan are particularly well suited for this type of investigation, as they are separate claimants in multiple disputes, yet both draw upon the same historical record when establishing and justifying their claims. Leveraging fieldwork in Taiwan, China, and the US, this dissertation includes in-depth case studies of China's and Taiwan's respective claims in both the South China Sea and East China Sea disputes. Evidence from this dissertation indicates that officials in both China and Taiwan have struggled with how to reconcile history and international law when establishing their claims, and that this struggle has introduced ambiguity into China's and Taiwan's claims. Amid this process, domestic political dynamics have played a dominant role in shaping the options available and the potential for claims to change in the future. In Taiwan's democratic system, where national identity is highly contested through party politics, opinions vary along a broad spectrum as to the proper borders of the nation, and there is considerable evidence that Taiwan's claims may change in the near future. In contrast, within China's single-party authoritarian political system, where nationalism is a source of regime legitimacy, views on the proper interpretation of China's boundaries do vary, but within a much narrower range. In the dissertation's final chapter, additional cases, such as South Korea's position on Dokdo and Indonesia's approach to the defense of Natuna, are used as points of comparison to further clarify the theoretical findings.

Relevance:

100.00%

Abstract:

Many maritime countries in Europe have implemented marine environmental monitoring programmes which include the measurement of chemical contaminants and related biological effects. How best to integrate the data obtained from these two types of monitoring into meaningful assessments has been the subject of recent efforts by International Council for the Exploration of the Sea (ICES) Expert Groups. Work within these groups has concentrated on defining a core set of chemical and biological endpoints that can be used across maritime areas, as well as confounding factors, supporting parameters, and protocols for measurement. The framework comprises markers for concentrations of, exposure to, and effects from, contaminants. Most importantly, assessment criteria for biological effect measurements have been set, and the framework suggests how these measurements can be used in an integrated manner alongside contaminant measurements in biota, sediments, and potentially water. Output from this process resulted in OSPAR Commission (www.ospar.org) guidelines that were adopted in 2012 on a trial basis for a period of three years. The developed assessment framework can furthermore provide a suitable approach for assessing Good Environmental Status (GES) under Descriptor 8 of the European Union (EU) Marine Strategy Framework Directive (MSFD).