281 results for Automatic classification


Relevance: 20.00%

Abstract:

This paper considers the debate about the relationship between globalization and media policy from the perspective provided by a current review of the Australian media classification scheme. Drawing upon the author’s recent experience in being ‘inside’ the policy process, as Lead Commissioner on the Australian National Classification Scheme Review, it is argued that theories of globalization – including theories of neoliberal globalization – fail to adequately capture the complexities of the reform process, particularly around the relationship between regulation and markets. The paper considers the pressure points for media content policies arising from media globalization, and the wider questions surrounding media content policies in an age of media convergence.

Relevance: 20.00%

Abstract:

A highly sensitive fiber Bragg grating (FBG) strain sensor with automatic temperature compensation is demonstrated. The FBG is axially linked with a stick, and their free ends are fixed to the measured object. When the measured strain changes, the stick does not change in length, but the FBG does. When the temperature changes, the stick changes in length and pulls the FBG, realizing temperature compensation. In experiments, a strain sensitivity 1.45 times that of a bare FBG is achieved, with temperature compensation limiting the Bragg wavelength drift to less than 0.1 nm over a 100 °C shift.
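The compensation scheme above leaves only the strain term in the standard FBG response. As a rough illustration (the relation and the photo-elastic coefficient p_e ≈ 0.22 are textbook values, not figures from this paper), the strain recovered from a measured wavelength shift can be computed as:

```python
# Illustrative FBG strain calculation (standard textbook relation, not the paper's data).
# With temperature compensated: delta_lambda / lambda_B = (1 - p_e) * strain

def strain_from_shift(delta_lambda_nm, lambda_b_nm=1550.0, p_e=0.22):
    """Convert a Bragg wavelength shift to strain (dimensionless)."""
    return (delta_lambda_nm / lambda_b_nm) / (1.0 - p_e)

# A hypothetical 1.2 nm shift on a 1550 nm grating:
eps = strain_from_shift(1.2)
print(f"{eps * 1e6:.0f} microstrain")  # ≈ 993 microstrain
```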

Relevance: 20.00%

Abstract:

Electronic services are a leitmotif in 'hot' topics like Software as a Service, Service-Oriented Architecture (SOA), Service-Oriented Computing, Cloud Computing, application markets and smart devices. We propose to consider these in what has been termed the Service Ecosystem (SES). The SES encompasses all levels of electronic services and their interaction, with human consumption and initiation on its periphery, in much the same way the 'Web' describes a plethora of technologies that eventuate to connect information and expose it to humans. Presently, the SES is heterogeneous, fragmented and confined to semi-closed systems. A key issue hampering the emergence of an integrated SES is Service Discovery (SD). An SES will be dynamic, with areas of structured and unstructured information within which service providers and 'lay' human consumers interact; until now the two have been disjoint, e.g. SOA-enabled organisations, industries and domains are choreographed by domain experts or 'hard-wired' to smart-device application markets and web applications. In an SES, services are accessible, comparable and exchangeable to human consumers, closing the gap to the providers. This requires a new SD with which humans can discover services transparently and effectively, without special knowledge or training. We propose two modes of discovery: directed search, which follows an agenda, and explorative search, which speculatively expands knowledge of an area of interest by means of categories. Inspired by conceptual space theory from cognitive science, we propose to implement both modes of discovery using concepts to map a lay consumer's service need to terminologically sophisticated descriptions of services. To this end, we reframe SD as an information retrieval task on the information attached to services, such as descriptions, reviews, documentation and web sites: the Service Information Shadow.
The Semantic Space model transforms the shadow's unstructured semantic information into a geometric, concept-like representation. We introduce an improved and extended Semantic Space that includes categorization, calling it the Semantic Service Discovery model. We evaluate our model with a highly relevant, service-related corpus simulating a Service Information Shadow, including manually constructed complex service agendas as well as manual groupings of services. We compare our model against state-of-the-art information retrieval systems and clustering algorithms. By means of an extensive series of empirical evaluations, we establish optimal parameter settings for the semantic space model. The evaluations demonstrate the model's effectiveness for SD in terms of retrieval precision over state-of-the-art information retrieval models (directed search) and the meaningful, automatic categorization of service-related information, which shows potential to form the basis of a useful, cognitively motivated map of the SES for exploratory search.
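As a loose illustration of the directed-search idea (not the paper's actual Semantic Space implementation), the sketch below maps toy service descriptions and a lay query into a low-rank "semantic" space via SVD and ranks services by cosine similarity; all service names and texts are invented:

```python
# LSA-style sketch of directed search over service descriptions.
# The paper's Semantic Space model is more elaborate; this only illustrates
# mapping lay queries and service texts into one geometric space.
import numpy as np

services = {
    "pay-invoice": "pay an invoice online with credit card",
    "book-flight": "search and book cheap flights and airline tickets",
    "send-parcel": "send a parcel courier delivery tracking",
}

vocab = sorted({w for text in services.values() for w in text.split()})
idx = {w: i for i, w in enumerate(vocab)}

def vec(text):
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in idx:
            v[idx[w]] += 1.0
    return v

M = np.array([vec(t) for t in services.values()])   # service-term matrix
U, s, Vt = np.linalg.svd(M, full_matrices=False)    # latent "semantic space"
k = 2

def project(v):
    return v @ Vt[:k].T                             # map a text into the space

def search(query):
    q = project(vec(query))
    scores = {}
    for name, text in services.items():
        d = project(vec(text))
        scores[name] = float(q @ d / (np.linalg.norm(q) * np.linalg.norm(d) + 1e-12))
    return max(scores, key=scores.get)

print(search("book cheap airline tickets"))
```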

Relevance: 20.00%

Abstract:

The development of text classification techniques has been driven largely by the increasing availability and widespread use of digital documents over the past decade. Usually, the performance of text classification relies on the quality of the categories and the accuracy of classifiers learned from samples. When training samples are unavailable or the categories are unqualified, text classification performance is degraded. In this paper, we propose an unsupervised multi-label text classification method that classifies documents using a large set of categories stored in a world ontology. The approach has been evaluated promisingly by comparison with typical text classification methods, using a real-world document collection and ground truth encoded by human experts.
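A minimal sketch of the unsupervised, multi-label idea (the category names and term sets below are invented stand-ins for an ontology, not the paper's data): each document receives every category label whose terms it sufficiently overlaps, so no training samples are needed:

```python
# Unsupervised multi-label labelling against ontology-like categories.
# Categories and terms are illustrative, not from the paper.

categories = {
    "sport":    {"football", "match", "team", "goal", "league"},
    "finance":  {"bank", "market", "shares", "profit", "loan"},
    "politics": {"election", "minister", "parliament", "vote"},
}

def classify(doc, threshold=0.2):
    """Return every category whose term set the document sufficiently overlaps."""
    words = set(doc.lower().split())
    labels = []
    for cat, terms in categories.items():
        overlap = len(words & terms) / len(terms)   # fraction of category terms seen
        if overlap >= threshold:
            labels.append(cat)
    return labels

print(classify("the team scored a late goal to win the match"))   # -> ['sport']
print(classify("parliament vote on bank shares"))                 # -> ['finance', 'politics']
```

The second call shows the multi-label case: one document, two categories, no classifier training.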

Relevance: 20.00%

Abstract:

There is limited understanding of the business strategies of parliamentary government departments. This study focuses on the strategies of the departments of two state governments in Australia. The strategies are derived from department strategic plans available in the public domain, collected from the respective websites. The results indicate that strategies fall into seven categories: internal, development, political, partnership, environment, reorientation and status quo. Department strategies are mainly internal or development, with development strategy the main focus of departments such as transport and infrastructure. Political strategy is prevalent in departments related to communities, and to education and training. Further, three layers of strategy are identified, namely kernel, cluster and individual, which are mapped to the developed taxonomy.

Relevance: 20.00%

Abstract:

Load in distribution networks is normally measured at the 11 kV supply points; little or no information is known about the types of customers and their contributions to the load. This paper proposes statistical methods to decompose an unknown distribution feeder load into its customer load sector/subsector profiles. The approach used in this paper should assist electricity suppliers in economic load management, strategic planning and future network reinforcement.
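One plausible reading of such a decomposition (an illustration, not necessarily the paper's exact method) is a least-squares fit: the measured feeder profile is expressed as a mix of known per-customer sector profiles, and the mix is solved for:

```python
# Toy decomposition of a feeder load into sector profiles by least squares.
# Profiles and customer counts are invented for illustration.
import numpy as np

profiles = np.array([
    [1.0, 2.0, 3.0, 2.0],   # domestic kW per customer, 4-point daily profile
    [3.0, 3.0, 1.0, 1.0],   # commercial kW per customer
]).T                         # shape (time points, sectors)

true_mix = np.array([100.0, 40.0])   # hypothetical customer counts per sector
feeder = profiles @ true_mix         # "measured" 11 kV feeder load

# Recover the sector mix from the feeder measurement alone:
mix, *_ = np.linalg.lstsq(profiles, feeder, rcond=None)
print(np.round(mix, 1))              # recovers ≈ [100.  40.]
```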

Relevance: 20.00%

Abstract:

The most suitable water temperature range for domestic purposes is about 20 °C to 26 °C. Both cold and hot water are also frequently required for industrial purposes. In summer, bringing the water temperature down to a comfortable range causes significant energy consumption. This project aims to save the energy used to control water temperature by insulating the water tank: a better insulation system reduces the disparity between the desired temperature and the actual temperature, and hence saves energy significantly. Following the investigation, the project used a cotton jacket to insulate the tank, and the tank was placed under a paddy-straw shade with a view to attaining the maximum energy saving. It has been found that the reduction in energy consumption is about 50-60%, which is quite satisfactory. Since the comfortable temperature range varies from person to person, the project combines the insulating effect with an automatic water heater.
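The energy argument can be sketched with a steady-state heat-loss model (all numbers below, including the resistances and the 6 mm jacket thickness, are illustrative assumptions, not measurements from the project):

```python
# Back-of-envelope heat-loss comparison: Q = A * dT / R_total, where an
# insulating jacket adds thermal resistance R = thickness / conductivity.
# All figures are illustrative assumptions, not the project's measurements.

def heat_loss_w(area_m2, dT, r_bare=0.12, t_ins=0.0, k_ins=0.04):
    """Steady-state loss (W) through the tank wall, optionally insulated."""
    r_total = r_bare + (t_ins / k_ins if t_ins else 0.0)
    return area_m2 * dT / r_total

bare = heat_loss_w(area_m2=2.0, dT=15.0)                    # no insulation
insulated = heat_loss_w(area_m2=2.0, dT=15.0, t_ins=0.006)  # 6 mm cotton layer
saving = 1 - insulated / bare
print(f"saving ≈ {saving:.0%}")   # ≈ 56%, of the order the abstract reports
```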

Relevance: 20.00%

Abstract:

The increasing popularity of video consumption on mobile devices requires an effective video coding strategy. To cope with diverse communication networks, video services often need to maintain sustainable quality when the available bandwidth is limited. One strategy for visually optimised video adaptation is to implement region-of-interest (ROI) based scalability, whereby important regions are encoded at a higher quality while sufficient quality is maintained for the rest of the frame. The result is improved perceived quality at the same bit rate as normal encoding, which is particularly noticeable at lower bit rates. However, because of the difficulty of predicting the ROI accurately, there has been limited research and development of ROI-based video coding for general videos. In this paper, the phase spectrum of quaternion Fourier transform (PQFT) method is adopted to determine the ROI. To improve the results of ROI detection, the saliency map from the PQFT is augmented with maps created from high-level knowledge of factors known to attract human attention: maps that locate faces and that emphasise the centre of the screen are used in combination with the saliency map to determine the ROI. The contribution of this paper lies in the automatic ROI detection technique for coding low bit rate videos, which includes an ROI prioritisation technique that assigns different encoding qualities to multiple ROIs, and in the evaluation of the proposed automatic ROI detection, which is shown to perform close to human-identified ROI based on eye-fixation data.
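The single-channel ancestor of PQFT, phase-spectrum saliency, is easy to sketch: keep only the phase of the Fourier transform and reconstruct, so that discontinuities such as object boundaries dominate the result. The PQFT in the paper extends this idea to quaternion-valued colour/motion channels; the version below is a simplified grayscale illustration:

```python
# Simplified single-channel phase-spectrum saliency (PQFT generalises this
# to quaternions over colour and motion channels).
import numpy as np

def phase_saliency(img):
    """Saliency map from the phase of the 2-D Fourier transform."""
    F = np.fft.fft2(img)
    phase_only = F / (np.abs(F) + 1e-12)   # keep phase, discard magnitude
    recon = np.fft.ifft2(phase_only)
    sal = np.abs(recon) ** 2               # energy of phase-only reconstruction
    return sal / sal.max()                 # normalise to [0, 1]

# A flat image with one bright block: the block's boundary should stand out.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
sal = phase_saliency(img)
print(sal.shape)   # (64, 64)
```

In a full ROI pipeline this map would then be combined with the face and screen-centre maps before thresholding into regions.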

Relevance: 20.00%

Abstract:

This article outlines the key recommendations of the Australian Law Reform Commission’s review of the National Classification Scheme, as outlined in its report Classification – Content Regulation and Convergent Media (ALRC, 2012). It identifies key contextual factors that underpin the need for reform of media classification laws and policies, including the fragmentation of regulatory responsibilities and the convergence of media platforms, content and services, as well as discussing the ALRC’s approach to law reform.

Relevance: 20.00%

Abstract:

One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, electron cryo-microscopy and high-resolution single-particle analysis have advanced to the point where they now provide a methodology for high-resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is obtaining a large enough representative dataset (>100,000 particles). Traditionally, particles have been picked manually, an extremely labour-intensive process. The problem is made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle-picking software, which has been tested with both negatively stained and cryo-electron micrographs. The algorithm has been shown to be capable of selecting most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
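One common automatic-picking approach, shown here as a hedged sketch (the paper's own algorithm may differ), is normalised cross-correlation of a particle template against the noisy micrograph:

```python
# Toy particle picking by normalised cross-correlation (NCC).
# The synthetic "micrograph" and template below are invented test data.
import numpy as np

def ncc_pick(image, template):
    """Return (row, col) of the best template match (top-left corner)."""
    th, tw = template.shape
    t = (template - template.mean()) / (template.std() + 1e-12)
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r+th, c:c+tw]
            wn = (w - w.mean()) / (w.std() + 1e-12)
            score = float((wn * t).mean())   # NCC in [-1, 1]
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

rng = np.random.default_rng(0)
micrograph = rng.normal(0.0, 0.5, (40, 40))    # noise background
disc = np.zeros((7, 7)); disc[2:5, 2:5] = 5.0  # fake "particle" template
micrograph[10:17, 20:27] += disc               # embed the particle
print(ncc_pick(micrograph, disc))
```

In practice one would keep every local maximum above a threshold, not just the single best, to pick many particles per micrograph.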

Relevance: 20.00%

Abstract:

With the increasing number of stratospheric particles available for study (via the U2 and/or WB57F collections), it is essential that a simple yet rational classification scheme be developed for general use. Such a scheme should be applicable to all particles collected from the stratosphere, rather than limited only to extraterrestrial or chemical sub-groups. Criteria for the efficacy of such a scheme include: (a) objectivity, (b) ease of use, (c) acceptance within the broader scientific community and (d) how well the classification provides intrinsic categories consistent with our knowledge of the particle types present in the stratosphere.

Relevance: 20.00%

Abstract:

Several investigators have recently proposed classification schemes for stratospheric dust particles [1-3]. In addition, extraterrestrial materials within stratospheric dust collections may be used as a measure of micrometeorite flux [4]. However, little attention has been given to the problems of the stratospheric collection as a whole. These problems include: (a) determination of accurate particle abundances at a given point in time; (b) the extent of bias in the particle selection process; (c) the variation of particle shape and chemistry with size; (d) the efficacy of proposed classification schemes; and (e) accurate determination of the physical parameters associated with the particle collection process (e.g. minimum particle size collected, collection efficiency, variation of particle density with time). We present here preliminary results from SEM, EDS and, where appropriate, XRD analysis of all of the particles from a collection surface which sampled the stratosphere between 18 and 20 km in altitude. Determinations of particle densities from this study may then be used to refine models of the behavior of particles in the stratosphere [5].