Abstract:
Facial expression recognition (FER) algorithms mainly focus on classification into a small discrete set of emotions or on representation of emotions using facial action units (AUs). Dimensional representation of emotions as continuous values in an arousal-valence space is comparatively under-investigated. It is not fully known whether fusion of geometric and texture features results in a better dimensional representation of spontaneous emotions. Moreover, the performance of many previously proposed approaches to dimensional representation has not been evaluated thoroughly on publicly available databases. To address these limitations, this paper presents an evaluation framework for dimensional representation of spontaneous facial expressions using texture and geometric features. SIFT, Gabor and LBP features are extracted around facial fiducial points and fused with FAP distance features. The CFS algorithm is adopted for discriminative texture feature selection. Experimental results on the publicly accessible NVIE database demonstrate that fusion of texture and geometry does not lead to much better performance than texture alone, but does yield a significant improvement over geometry alone. LBP features perform best when fused with geometric features. Distributions of arousal and valence for different emotions obtained via the feature extraction process are compared with those obtained from subjective ground-truth values assigned by viewers. Predicted valence is found to have a distribution more similar to ground truth than arousal in terms of covariance or Bhattacharyya distance, but it shows a greater distance between the means.
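The final comparison above rests on the Bhattacharyya distance between predicted and ground-truth arousal-valence distributions. As a minimal sketch (assuming, as the abstract implies but does not state, that each distribution is summarised by a mean and covariance over the 2-D arousal-valence space), the closed-form distance between two Gaussians can be computed as:

```python
import numpy as np

def bhattacharyya_gaussian(mu1, cov1, mu2, cov2):
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    mu1, mu2 = np.asarray(mu1, float), np.asarray(mu2, float)
    cov1, cov2 = np.asarray(cov1, float), np.asarray(cov2, float)
    cov = (cov1 + cov2) / 2.0                      # averaged covariance
    diff = mu1 - mu2
    mean_term = 0.125 * diff @ np.linalg.inv(cov) @ diff
    cov_term = 0.5 * np.log(np.linalg.det(cov) /
                            np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2)))
    return mean_term + cov_term

# Hypothetical (valence, arousal) summaries for predicted vs ground truth:
pred_mu,  pred_cov  = [0.20, -0.10], [[0.04, 0.0], [0.0, 0.09]]
truth_mu, truth_cov = [0.35, -0.05], [[0.05, 0.0], [0.0, 0.08]]
print(bhattacharyya_gaussian(pred_mu, pred_cov, truth_mu, truth_cov))
```

The mean term isolates exactly the distance between means that the abstract reports as larger for valence, while the covariance term captures the similarity in spread.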
Abstract:
This work details the results of a face authentication test (FAT2004) (http://www.ee.surrey.ac.uk/banca/icpr2004) held in conjunction with the 17th International Conference on Pattern Recognition. The contest was held on the publicly available BANCA database (http://www.ee.surrey.ac.uk/banca) according to a defined protocol (E. Bailly-Bailliere et al., June 2003). The competition also had a sequestered part in which institutions had to submit their algorithms for independent testing. Results were submitted for 13 different verification algorithms from 10 institutions. In addition, a standard set of face recognition software packages from the Internet (http://www.cs.colostate.edu/evalfacerec) was used to provide a baseline performance measure.
Abstract:
Distributed Denial-of-Service (DDoS) attacks continue to be one of the most pernicious threats to the delivery of services over the Internet. Not only are DDoS attacks present in many guises, they are also continuously evolving as new vulnerabilities are exploited. Hence, accurate detection of these attacks remains a challenging problem and a necessity for ensuring high-end network security. An intrinsic challenge in addressing this problem is to effectively distinguish these Denial-of-Service attacks from similar-looking Flash Events (FEs) created by legitimate clients. A considerable overlap between the general characteristics of FEs and DDoS attacks makes it difficult to precisely separate these two classes of Internet activity. In this paper, we propose parameters which can be used to explicitly distinguish FEs from DDoS attacks, and analyse two real-world publicly available datasets to validate our proposal. Our analysis shows that even though FEs appear very similar to DDoS attacks, there are several subtle dissimilarities which can be exploited to separate these two classes of events.
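The abstract does not enumerate the proposed parameters, so the following is a purely hypothetical illustration of this style of discriminator: one feature often used to separate flash crowds from attacks is the fraction of source IPs in the event window that were never seen in the site's prior history, since flash events are dominated by returning legitimate clients while botnets introduce many brand-new sources.

```python
def new_source_fraction(window_ips, known_ips):
    """Fraction of distinct source IPs in the current traffic window
    that never appeared in the pre-event history (an illustrative
    discriminator, not the parameters proposed in the paper)."""
    window = set(window_ips)
    if not window:
        return 0.0
    new = window - set(known_ips)
    return len(new) / len(window)

history = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]          # pre-event clients
flash   = ["10.0.0.1", "10.0.0.2", "10.0.0.2", "10.0.0.4"]
attack  = ["172.16.0.%d" % i for i in range(20)] + ["10.0.0.1"]

print(new_source_fraction(flash, history))   # 1 of 3 distinct IPs is new
print(new_source_fraction(attack, history))  # 20 of 21 distinct IPs are new
```

A flash event scores low on this feature while a spoofed or bot-driven attack scores high, which is the kind of subtle dissimilarity the abstract alludes to.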
Abstract:
The most common human cancers are malignant neoplasms of the skin. Incidence of cutaneous melanoma is rising especially steeply, with minimal progress in non-surgical treatment of advanced disease. Despite significant effort to identify independent predictors of melanoma outcome, no accepted histopathological, molecular or immunohistochemical marker defines subsets of this neoplasm. Accordingly, though melanoma is thought to present with different 'taxonomic' forms, these are considered part of a continuous spectrum rather than discrete entities. Here we report the discovery of a subset of melanomas identified by mathematical analysis of gene expression in a series of samples. Remarkably, many genes underlying the classification of this subset are differentially regulated in invasive melanomas that form primitive tubular networks in vitro, a feature of some highly aggressive metastatic melanomas. Global transcript analysis can identify unrecognized subtypes of cutaneous melanoma and predict experimentally verifiable phenotypic characteristics that may be of importance to disease progression.
Abstract:
Purpose Maintenance management is a core process in infrastructure asset management. Infrastructure organisations must constantly strive to ensure the effectiveness of this process in order to obtain the greatest lifetime value from their infrastructure assets. This paper aims to investigate how infrastructure organisations can enhance the effectiveness of their maintenance management process. Approach This study utilised multiple case studies as the research approach. The case organisations were asked to identify the challenges faced in the maintenance process and the approaches they have adopted to overcome these challenges. Analysis of these findings, together with deductive reasoning, led to the development of the capability proposed as necessary for an effective maintenance management process. Findings The case studies reveal that the maintenance management process is core to ensuring that infrastructure assets are optimally and functionally available to support business operations. However, the main challenge is the lack of skilled and experienced personnel able to understand and anticipate maintenance requirements. A second challenge is the reduced window of time available to carry out inspection and maintenance works. To overcome these challenges, the case organisations have invested in technologies. However, the technologies available to facilitate this process are complex and constantly changing. Consequently, there is a need for infrastructure organisations to develop their technological absorptive capability, i.e. the ability to embrace and capitalise on new technologies to enhance their maintenance management process. Originality/Value This paper is original in that it provides empirical evidence identifying technological absorptive capability as core to improving the maintenance management process.
The findings are valuable because they shed light on where infrastructure organisations, regardless of whether they are privately or publicly owned, should channel their scarce resources. The development of this core capability will ensure that the maintenance process can contribute value to their organisation.
Abstract:
Notwithstanding the obvious potential advantages of information and communications technology (ICT) in the enhanced provision of healthcare services, there are some concerns associated with integration of and access to electronic health records. A security violation in health records, such as an unauthorised disclosure or unauthorised alteration of an individual's health information, can significantly undermine both healthcare providers' and consumers' confidence and trust in e-health systems. A crisis in confidence in any national level e-health system could seriously degrade the realisation of the system's potential benefits. In response to the privacy and security requirements for the protection of health information, this research project investigated national and international e-health development activities to identify the necessary requirements for the creation of a trusted health information system architecture consistent with legislative and regulatory requirements and relevant health informatics standards. The research examined the appropriateness and sustainability of the current approaches for the protection of health information. It then proposed an architecture to facilitate the viable and sustainable enforcement of privacy and security in health information systems under the project title "Open and Trusted Health Information Systems (OTHIS)". OTHIS addresses necessary security controls to protect sensitive health information when such data is at rest, during processing and in transit with three separate and achievable security function-based concepts and modules: a) Health Informatics Application Security (HIAS); b) Health Informatics Access Control (HIAC); and c) Health Informatics Network Security (HINS). 
The outcome of this research is a roadmap for a viable and sustainable architecture providing robust protection and security of health information, including elucidation of the three achievable security control subsystem requirements within the proposed architecture. The successful completion of two proof-of-concept prototypes demonstrated the comprehensibility, feasibility and practicality of the HIAC and HIAS models for the development and assessment of trusted health systems. Meanwhile, the OTHIS architecture has provided guidance for technical and security design appropriate to the development and implementation of trusted health information systems, while simultaneously offering direction for ongoing research projects. The socio-economic implication of this research is that it embraces the need for low-cost security strategies in the face of economic realities, using open-source technologies for the overall test implementation. This allows the proposed architecture to be publicly accessible, providing a platform for interoperability to meet real-world application security demands. On the whole, the OTHIS architecture sets a high security standard for the establishment and maintenance of both current and future health information systems, thereby increasing healthcare providers' and consumers' trust in the adoption of electronic health records to realise the associated benefits.
Abstract:
The Malaysian National Innovation Model blueprint states that there is an urgent need to pursue an innovation-oriented economy to improve the nation’s capacity for knowledge, creativity and innovation. In nurturing a pervasive innovation culture, the Malaysian government declared 2010 an Innovative Year, in which creativity among its population is highly celebrated. However, while Malaysian citizens are encouraged to be creative and innovative, scientific data and information generated from publicly funded research in Malaysia are locked up by rigid intellectual property licensing regimes and traditional publishing models. Reflecting on these circumstances, this paper looks at, and argues why, scientific data and information should be made freely available, accessible and re-usable to promote grassroots innovation in Malaysia. Using innovation theory as its platform of argument, this paper calls for an open access policy for publicly funded research output to be adopted and implemented in Malaysia. Simultaneously, a normative analytic approach is used to determine the type of open access policy that ought to be adopted to spur greater innovation among Malaysians.
Abstract:
Objective: Hospital EDs are a significant and high-profile component of Australia’s health-care system, which in recent years have experienced considerable crowding. This crowding is caused by the combination of increasing demand, throughput and output factors. The aim of the present article is to clarify trends in the use of public ED services across Australia with a view to providing an evidence basis for future policy analysis and discussion. Methods: The data for the present article have been extracted, compiled and analysed from publicly available sources for a 10 year period between 2000–2001 and 2009–2010. Results: Demand for public ED care increased by 37% over the decade, an average annual increase of 1.8% in the utilization rate per 1000 persons. There were significant differences in utilization rates and in trends in growth among states and territories that do not easily relate to general population trends alone. Conclusions: This growth in demand exceeds general population growth, and the variability between states both in utilization rates and overall trends defies immediate explanation. The growth in demand for ED services is a partial contributor to the crowding being experienced in EDs across Australia. There is a need for more detailed study, including qualitative analysis of patient motivations in order to identify the factors driving this growth in demand.
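As a quick consistency check on the figures above, 37% total growth over the 10-year window corresponds to roughly 3.2% compound annual growth in total presentations; this sits above the reported 1.8% average annual rise in the per-1000-persons utilisation rate, consistent with population growth accounting for the remainder. A minimal sketch of the arithmetic:

```python
def cagr(total_growth_fraction, years):
    """Compound annual growth rate implied by total growth over a period."""
    return (1.0 + total_growth_fraction) ** (1.0 / years) - 1.0

# 37% total growth in ED presentations over the decade:
annual = cagr(0.37, 10)
print(f"{annual:.1%} per year")  # prints "3.2% per year"
```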
Abstract:
This paper critically analyses the proposed Australian regulatory approach to the crediting of biological sequestration activities (biosequestration) under the Australian Carbon Farming Initiative and its interaction with State-based carbon rights, the national carbon-pricing mechanism, and the international Kyoto Protocol and carbon-trading markets. Norms and principles have been established by the Kyoto Protocol to guide the creation of additional, verifiable, and permanent credits from biosequestration activities. This paper examines the proposed arrangements under the Australian Carbon Farming Initiative and Carbon Pricing Mechanism to determine whether they are consistent with those international norms and standards. This paper identifies a number of anomalies associated with the legal treatment of additionality and permanence and issuance of carbon credits within the Australian schemes. In light of this, the paper considers the possible legal implications for the national and international transfer, surrender and use of these offset credits.
Abstract:
A rule-based approach for classifying previously identified medical concepts in clinical free text into an assertion category is presented. There are six categories of assertion for the task: Present, Absent, Possible, Conditional, Hypothetical and Not associated with the patient. The assertion classification algorithms were largely based on extending the popular NegEx and ConText algorithms. In addition, the clinical terminology SNOMED CT and other publicly available dictionaries were used to classify assertions that did not fit the NegEx/ConText model. The data for this task include discharge summaries from Partners HealthCare and from Beth Israel Deaconess Medical Center, as well as discharge summaries and progress notes from the University of Pittsburgh Medical Center. The set consists of 349 discharge reports, each with pairs of ground-truth concept and assertion files for system development, and 477 reports for evaluation. The system’s performance on the evaluation data set was 0.83, 0.83 and 0.83 for recall, precision and F1-measure, respectively. Although the rule-based system shows promise, further improvements can be made by incorporating machine learning approaches.
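The NegEx/ConText family of algorithms works by scanning a window of tokens around a concept mention for trigger phrases that flip the assertion category. A toy sketch of the idea follows; the trigger lists are illustrative fragments, not the published NegEx/ConText sets, and real implementations also handle scope-terminating words, post-concept triggers, and word-boundary matching:

```python
# Toy NegEx-style assertion classifier: a concept defaults to "Present"
# unless a trigger phrase fires in the window of words preceding it.
TRIGGERS = {
    "Absent":       ["denies", "without", "negative for"],
    "Possible":     ["possible", "probable", "suspicious for"],
    "Hypothetical": ["return if", "come back if"],
}

def classify_assertion(sentence, concept, window=6):
    """Return the assertion category for `concept` in `sentence`."""
    sent = sentence.lower()
    idx = sent.find(concept.lower())
    if idx == -1:
        raise ValueError("concept not found in sentence")
    # Look only at the last `window` words before the concept mention.
    scope = " ".join(sent[:idx].split()[-window:])
    for category, phrases in TRIGGERS.items():
        if any(p in scope for p in phrases):
            return category
    return "Present"

print(classify_assertion("The patient denies chest pain.", "chest pain"))    # Absent
print(classify_assertion("Findings suspicious for pneumonia.", "pneumonia")) # Possible
print(classify_assertion("Patient reports chest pain.", "chest pain"))       # Present
```

Extending such rules while covering categories they do not fit (via SNOMED CT and other dictionaries, as described above) is the substance of the reported approach.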
Abstract:
Airports, whether publicly or privately owned or operated, fill both public and private roles. They need to act as public infrastructure providers and as businesses which cover their operating costs. That leads to special governance concerns with respect to consumers and competitors which are only beginning to be addressed. These challenges are highlighted both by shifts in ownership status and by the expansion of roles performed by airports as passenger and cargo volumes continue to increase and as nearby urban areas expand outward towards airports. We survey five ways in which the regulatory shoe doesn't quite fit the needs. Our findings suggest that, while ad hoc measures limit political tension, new governance measures are needed.
Abstract:
It is important to try to come to grips with what content and applications are likely to be feasible, popular and beneficial on the National Broadband Network, which is being rolled out now. This short article looks at the three main types of content ('unmanaged', 'managed' and 'publicly supported' services), shows how creative content is being, or could be, deployed across all three, and discusses the policy opportunities and challenges for content industries in connecting with what Minister for Regional Australia, Regional Development and Local Government and Minister for the Arts Simon Crean calls 'the largest cultural infrastructure project Australia has ever seen'.
Abstract:
Australia should seek new and liberating ways to bring together the arts, popular culture and the creative industries, according to Arts and creative industries. The report, funded by the Australia Council for the Arts and prepared by Professor Justin O’Connor of the Creative Industries Faculty at Queensland University of Technology, looks at ways in which the policy relationship between these often polarised sectors of arts and creative industries might be re-thought and approached more productively. The report is in two parts, commencing with An Australian conversation, in which Professor O’Connor, with Stuart Cunningham and Luke Jaaniste, documents a series of in-depth interviews with 18 leading practitioners across the creative industries. They discuss their perceptions of the similarities, differences and connections between the arts and creative industries. The interviews frequently returned to the fundamental question of what was meant by ‘art’ and ‘creative industries’. The second, larger part of Arts and creative industries addresses this question through an extensive review of discussions of art and its relation to society and culture over the last few centuries. A historical overview highlights the importance that art has had in developing our comprehension of the modern world. It also examines the enthusiasm for the creative industries over the last 15 years or so and the impact this has had on creative policy-making. Arts and creative industries suggests there is no dividing line between publicly funded arts, popular culture and the blossoming businesses of the creative sector, and national policy should reflect this. This study was commissioned by the Australia Council as part of a long-running and productive relationship between the council and the ARC Centre of Excellence for Creative Industries and Innovation at the Queensland University of Technology.
Abstract:
In the era of Web 2.0, huge volumes of consumer reviews are posted to the Internet every day. Manual approaches to detecting and analyzing fake reviews (i.e., spam) are not practical due to the problem of information overload. However, the design and development of automated methods of detecting fake reviews is a challenging research problem. The main reason is that fake reviews are specifically composed to mislead readers, so they may appear the same as legitimate reviews (i.e., ham). As a result, discriminatory features that would enable individual reviews to be classified as spam or ham may not be available. Guided by the design science research methodology, the main contribution of this study is the design and instantiation of novel computational models for detecting fake reviews. In particular, a novel text mining model is developed and integrated into a semantic language model for the detection of untruthful reviews. The models are then evaluated based on a real-world dataset collected from amazon.com. The results of our experiments confirm that the proposed models outperform other well-known baseline models in detecting fake reviews. To the best of our knowledge, the work discussed in this article represents the first successful attempt to apply text mining methods and semantic language models to the detection of fake consumer reviews. A managerial implication of our research is that firms can apply our design artifacts to monitor online consumer reviews to develop effective marketing or product design strategies based on genuine consumer feedback posted to the Internet.
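The study's text-mining and semantic language models are not specified in the abstract; as a generic baseline for the spam/ham review classification task they address, a bag-of-words naive Bayes classifier with Laplace smoothing can be sketched as follows (the reviews and labels are invented for illustration):

```python
import math
from collections import Counter

# Minimal bag-of-words naive Bayes spam/ham review classifier.
# An illustrative baseline, not the models proposed in the study.
def train(docs):
    """docs: list of (text, label) pairs; returns model state."""
    word_counts = {"spam": Counter(), "ham": Counter()}
    class_counts = Counter()
    for text, label in docs:
        class_counts[label] += 1
        word_counts[label].update(text.lower().split())
    vocab = set(word_counts["spam"]) | set(word_counts["ham"])
    return word_counts, class_counts, vocab

def classify(model, text):
    word_counts, class_counts, vocab = model
    total = sum(class_counts.values())
    best, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total)       # class prior
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.lower().split():
            score += math.log((word_counts[label][w] + 1) / denom)  # Laplace smoothing
        if score > best_score:
            best, best_score = label, score
    return best

reviews = [
    ("best product ever buy now amazing deal", "spam"),
    ("unbelievable amazing must buy now", "spam"),
    ("the battery lasted two days which was disappointing", "ham"),
    ("solid build quality but the screen scratches easily", "ham"),
]
model = train(reviews)
print(classify(model, "amazing deal buy now"))                      # spam
print(classify(model, "battery and screen quality disappointing")) # ham
```

The abstract's core difficulty is visible even here: a fake review written in the vocabulary of genuine ones would defeat any purely lexical model, which is why the study moves beyond surface features to semantic language models.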
Abstract:
This study examines how both the level and the nature of environmental information voluntarily disclosed by Australian firms relate to their underlying environmental performance. Disclosure is scored using an index developed by Clarkson et al. (2008) based on Global Reporting Initiative (GRI) Guidelines, and the environmental performance measure is based on emission data available from the National Pollutant Inventory (NPI). The sample consists of 51 firms that reported to the NPI in both 2002 and 2006. The findings are as follows. First, descriptive statistics indicate that while there was modest improvement in disclosure between 2002 and 2006, the highest disclosure score obtained was just slightly in excess of 50% of the maximum available based on the GRI Guidelines. Second, the results consistently indicate that not only do firms with a higher pollution propensity disclose more environmental information, but they also rely on disclosures that the GRI views as inherently more objective and verifiable. Taken together, these results suggest that concerns regarding the reliability of voluntary environmental disclosures in the Australian context remain valid, and thereby potentially signal a need for both enhanced mandatory reporting requirements and improved enforcement. In this regard, our study also informs regulatory policy on mandatory disclosures of environmental performance.