90 results for Metadata schema
Abstract:
Accurate process model elicitation continues to be a time-consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Theories of situated cognition indicate that interactive 3D representations of real work environments engage and prime the cognitive state of the viewer. In this paper, our major contribution is to augment a previous process elicitation methodology with virtual world context metadata, drawn from a 3D simulation of the workplace. We present a conceptual and formal approach for representing this contextual metadata, integrated into a process similarity measure that provides hints for the business analyst to use in later modelling steps. Finally, we conclude with examples from two use cases to illustrate the potential of this approach.
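To make the similarity idea concrete, here is a minimal sketch of how 3D-world context metadata could be blended into an activity similarity score. It is not the paper's actual formulation: the `Activity` structure, the token-overlap label measure and the 0.5 context weighting are all our assumptions.

```python
# Hypothetical sketch: blending label similarity with virtual-world context
# overlap when comparing two elicited process activities.
from dataclasses import dataclass

@dataclass
class Activity:
    label: str
    context: set  # assumed 3D-world metadata: nearby objects, locations, tools

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if a | b else 0.0

def label_similarity(a: str, b: str) -> float:
    # Crude token overlap; a real system might use edit-distance or semantic measures.
    return jaccard(set(a.lower().split()), set(b.lower().split()))

def activity_similarity(x: Activity, y: Activity, w_ctx: float = 0.5) -> float:
    """Weighted blend of label similarity and context-metadata overlap."""
    return (1 - w_ctx) * label_similarity(x.label, y.label) + w_ctx * jaccard(x.context, y.context)

a = Activity("check patient chart", {"ward", "chart_trolley"})
b = Activity("review patient chart", {"ward", "nurse_station", "chart_trolley"})
print(round(activity_similarity(a, b), 2))  # high score hints these may be the same task
```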
Abstract:
The firm is faced with a decision concerning the nature of intra-organizational exchange relationships with internal human resources and the nature of inter-organizational exchange relationships with market firms. In both situations, the firm can develop an exchange that ranges from a discrete exchange to a relational exchange. Transaction Cost Economics (TCE) and the Resource Dependency View (RDV) represent alternative efficiency-based explanations of the nature of the exchange relationship. The aim of the paper is to test these two theories in respect of air conditioning maintenance in retail centres. Multiple sources of information are generated from case studies of Australian retail centres to test these theories in respect of internalized operations management (concerning strategic aspects of air conditioning maintenance) and externalized planned routine air conditioning maintenance. The analysis of the data centres on pattern matching. It is concluded that the data support TCE, on the basis of a development in TCE's contractual schema. Further research is suggested towards taking a pluralistic stance and developing a combined efficiency and power hypothesis, upon which Williamson has speculated. For practice, the conclusions also offer a timely cautionary note concerning the adoption of one approach in all exchange relationships.
Abstract:
There is currently a strong focus worldwide on the potential of large-scale Electronic Health Record (EHR) systems to cut costs and improve patient outcomes through increased efficiency. This is accomplished by aggregating medical data from isolated Electronic Medical Record databases maintained by different healthcare providers. Concerns about the privacy and reliability of Electronic Health Records are crucial to healthcare service consumers. Traditional security mechanisms are designed to satisfy confidentiality, integrity, and availability requirements, but they fail to provide a measurement tool for data reliability from a data entry perspective. In this paper, we introduce a Medical Data Reliability Assessment (MDRA) service model to assess the reliability of medical data by evaluating the trustworthiness of its sources, usually the healthcare provider which created the data and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record. The result is then expressed by manipulating health record metadata to alert medical practitioners relying on the information to possible reliability problems.
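As an illustration only (the MDRA model's actual scoring is defined in the paper), one simple way to express a record's reliability in terms of the trustworthiness of its two sources, and to surface the result through record metadata, is:

```python
# Illustrative sketch, not the MDRA model itself: a record's reliability is
# naively taken as the product of its sources' trust scores, then attached to
# the record's metadata so a reading practitioner can be alerted.
def record_reliability(provider_trust: float, practitioner_trust: float) -> float:
    """Both inputs assumed in [0, 1]; the combined score is also in [0, 1]."""
    return provider_trust * practitioner_trust

record = {"diagnosis": "hypertension", "metadata": {}}
score = record_reliability(provider_trust=0.9, practitioner_trust=0.7)
record["metadata"]["reliability"] = score   # 0.63
record["metadata"]["alert"] = score < 0.5   # hypothetical alert threshold
```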
Abstract:
Electronic Health Record (EHR) systems are being introduced to overcome the limitations associated with paper-based and isolated Electronic Medical Record (EMR) systems. This is accomplished by aggregating medical data and consolidating them in one digital repository. Though an EHR system provides obvious functional benefits, there is a growing concern about the privacy and reliability (trustworthiness) of Electronic Health Records. Security requirements such as confidentiality, integrity, and availability can be satisfied by traditional hard security mechanisms. However, measuring data trustworthiness from the perspective of data entry is an issue that cannot be solved with traditional mechanisms, especially since degrees of trust change over time. In this paper, we introduce a Time-variant Medical Data Trustworthiness (TMDT) assessment model to evaluate the trustworthiness of medical data by evaluating the trustworthiness of its sources, namely the healthcare organisation where the data was created and the medical practitioner who diagnosed the patient and authorised entry of this data into the patient’s medical record, with respect to a certain period of time. The result can then be used by the EHR system to manipulate health record metadata to alert medical practitioners relying on the information to possible reliability problems.
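The "time-variant" aspect can be sketched as trust that decays as a record ages. The exponential form, the one-year half-life and the neutral prior below are illustrative assumptions, not the TMDT model's actual definition:

```python
# Hedged sketch of time-variant trust: a source's trust score observed at data
# entry decays toward a neutral prior as the record ages.
import math

def decayed_trust(trust_at_entry: float, age_days: float,
                  half_life_days: float = 365.0, prior: float = 0.5) -> float:
    """Exponentially interpolate from the entry-time trust toward a neutral prior."""
    w = math.exp(-math.log(2) * age_days / half_life_days)
    return w * trust_at_entry + (1 - w) * prior

print(decayed_trust(0.9, age_days=0))    # 0.9 -> fresh record, full confidence
print(decayed_trust(0.9, age_days=730))  # 0.6 -> two half-lives later
```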
Abstract:
Cultural objects are increasingly generated and stored in digital form, yet effective methods for their indexing and retrieval still remain an important area of research. The main problem arises from the disconnection between the content-based indexing approach used by computer scientists and the description-based approach used by information scientists. There is also a lack of representational schemes that allow the alignment of the semantics and context with keywords and low-level features that can be automatically extracted from the content of these cultural objects. This paper presents an integrated approach to address these problems, taking advantage of both computer science and information science approaches. We firstly discuss the requirements from a number of perspectives: users, content providers, content managers and technical systems. We then present an overview of our system architecture and describe various techniques which underlie the major components of the system. These include: automatic object category detection; user-driven tagging; metadata transform and augmentation; and an expression language for digital cultural objects. In addition, we discuss our experience in testing and evaluating the system on some existing collections, analyse the difficulties encountered and propose ways to address these problems.
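The "metadata transform and augmentation" component might look like the following sketch, which merges automatically detected content features with user tags into one description-based record. The field names and the merge rules are invented for illustration; the paper's actual schema will differ.

```python
# Hypothetical illustration of metadata augmentation: combine a curator-supplied
# record with content-based detection output and user-driven tags.
def augment_record(base: dict, detected_categories: list, user_tags: list) -> dict:
    record = dict(base)  # e.g. Dublin Core style descriptive fields
    record["subject"] = sorted(set(base.get("subject", [])) | set(user_tags))
    record["auto_category"] = detected_categories  # content-based indexing output
    return record

base = {"title": "Ceremonial mask", "subject": ["oceania"]}
print(augment_record(base, detected_categories=["carving", "wood"],
                     user_tags=["mask", "ritual"]))
```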
Abstract:
Purpose – The purpose of this study is to examine and extend Noer’s theoretical model of the new employment relationship. Design/methodology/approach – Case study methodology is used to scrutinise the model. The results of a literature-based survey on the elements underpinning the five values in the model are analysed from dual perspectives of individual and organization using a multi-source assessment instrument. A schema is developed to guide and inform a series of focus group discussions from an analysis of the survey data. Using content analysis, the transcripts from the focus group discussions are evaluated using the model’s values and their elements. The transcripts are also reviewed for implicit themes. The case studied is Flight Centre Limited, an Australian-based international retail travel company. Findings – Using this approach, some elements of the five values in Noer’s model are identified as characteristic of the company’s psychological contract. Specifically, to some extent, the model’s values of flexible deployment, customer focus, performance focus, project-based work, and human spirit and work can be applied in this case. A further analysis of the transcripts validates three additional values in the psychological contract literature: commitment; learning and development; and open information. As a result of the findings, Noer’s model is extended to eight values. Research limitations/implications – The study offers a research-based model of the new employment relationship. Since generalisations from the case study findings cannot be applied directly to other settings, the opportunity to test this model in a variety of contexts is open to other researchers. Originality/value – In practice, the methodology used is a unique process for benchmarking the psychological contract. The process may be applied in other business settings. By doing so, organization development professionals have a consulting framework for comparing an organization’s dominant psychological contract with the extended model presented here.
Abstract:
This paper reports on Years 8, 9 and 10 students’ knowledge of percent problem types, use of diagrams, and type of solution strategy. Non-proficient and semi-proficient students displayed the expected inflexible formula approach to solution, but proficient students used a flexible mixture of estimation, number sense and trial and error instead of the expected schema-based methods.
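For readers outside the percent literature, the three classic problem types can be worked on a single fact. This standard classification is our illustration and may not match the paper's own typology:

```latex
% The three classic percent problem types, worked on one fact (25% of 80 is 20).
\begin{align*}
\text{Find the part:}    &\quad 25\% \times 80 = 0.25 \times 80 = 20 \\
\text{Find the percent:} &\quad \tfrac{20}{80} = 0.25 = 25\% \\
\text{Find the whole:}   &\quad 25\% \times x = 20 \;\Rightarrow\; x = \tfrac{20}{0.25} = 80
\end{align*}
```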
Abstract:
This paper reports on a study in which Years 6 and 10 students were individually interviewed to determine their ability to unitise and reunitise number lines used to represent mixed numbers and improper fractions. Only 16.7% of the students (all Year 6) were successful on all three tasks and, in general, Year 6 students outperformed Year 10 students. The interviews revealed that the remaining students had incomplete, fragmented or non-existent structural knowledge of mixed numbers and improper fractions, and were unable to unitise or reunitise number lines. The implication for teaching is that instruction should focus on providing students with a variety of fraction representations in order to develop rich and flexible schema for all fraction types (mixed numbers, and proper and improper fractions).
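A worked example of the unitising idea (the example is ours, not drawn from the study's tasks): the same point on a number line can be read as an improper fraction or, by regrouping thirds into wholes, as a mixed number.

```latex
% Reunitising a number line: 7 ticks of one-third from 0, read two ways.
\[
\frac{7}{3} \;=\; \frac{6}{3} + \frac{1}{3} \;=\; 2\tfrac{1}{3}
\]
```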
Abstract:
We discuss issues and opportunities for designing experiences with 3D simulations of nature where the landscape and the interactant engage in an equitable dialogue. We consider the way digital representations of the world and design habits tend to detach from corporeal dimensions in experiencing the natural world and perpetuate motifs in games that reflect taming, territorializing or defending ourselves from nature. We reflect on the Digital Songlines project, which translates the schema of indigenous people to construct a natural environment, and the inherent difficulty in cross-culturally representing inter-connectedness. This leads us to discuss insights into the use of natural features by western people in cultural transmission and in their experiences in natural places. We propose that McCarthy and Wright's dialogical approach may reconcile conceptions of place and self in design, and conclude by considering experiments in which designers digitally reconstruct their own corporeal experience in natural physical landscapes.
Abstract:
This approach to sustainable design explores the possibility of creating an architectural design process which can iteratively produce optimised and sustainable design solutions. Driven by an evolution process based on genetic algorithms, the system allows the designer to “design the building design generator” rather than to “design the building”. The design concept is abstracted into a digital design schema, which allows transfer of the human creative vision into the rational language of a computer. The schema is then elaborated through the use of genetic algorithms to evolve innovative, performative and sustainable design solutions. The prioritisation of the project’s constraints and the subsequent design solutions synthesised during design generation are expected to resolve most of the major conflicts in the evaluation and optimisation phases. Mosques are used as the example building typology to ground the research activity. The spatial organisations of various mosque typologies are graphically represented by adjacency constraints between spaces. Each configuration is represented by a planar graph, which is then translated into a non-orthogonal dual graph and fed into the genetic algorithm system with fixed constraints and expected performance criteria set to govern evolution. The resultant Hierarchical Evolutionary Algorithmic Design System is developed by linking the evaluation process with environmental assessment tools to rank the candidate designs. The proposed system generates the concept, the seed, and the schema, and has environmental performance as one of the main criteria driving optimisation.
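The evolutionary loop the abstract describes follows the familiar genetic-algorithm skeleton below. The genome encoding and the fitness function here are stand-ins; in the actual system, candidates are scored by external environmental assessment tools rather than an analytic objective.

```python
# Minimal genetic-algorithm skeleton: rank a population of candidate layouts
# by a fitness score, keep an elite, and breed the next generation.
import random

random.seed(1)
GENES = 8  # stand-in encoding, e.g. spatial/adjacency choices as floats in [0, 1]

def fitness(genome):
    # Stand-in objective; the real system ranks designs via assessment tools.
    return -sum((g - 0.5) ** 2 for g in genome)

def crossover(a, b):
    cut = random.randrange(1, GENES)
    return a[:cut] + b[cut:]

def mutate(g, rate=0.1):
    return [min(1.0, max(0.0, x + random.gauss(0, 0.1))) if random.random() < rate else x
            for x in g]

pop = [[random.random() for _ in range(GENES)] for _ in range(30)]
for generation in range(50):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]  # rank-based selection of survivors
    pop = elite + [mutate(crossover(random.choice(elite), random.choice(elite)))
                   for _ in range(20)]
print(round(fitness(max(pop, key=fitness)), 4))  # best score after evolution
```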
Abstract:
Literally, the word compliance suggests conformity in fulfilling official requirements. The thesis presents the results of the analysis and design of a class of protocols called compliant cryptologic protocols (CCP). The thesis presents a notion of compliance in cryptosystems that is conducive as a cryptologic goal. CCP are employed in security systems used by at least two mutually mistrusting sets of entities. The individuals in the sets of entities only trust the design of the security system and any trusted third party the security system may include. Such a security system can be thought of as a broker between the mistrusting sets of entities. In order to provide confidence in operation for the mistrusting sets of entities, CCP must provide compliance verification mechanisms. These mechanisms are employed either by all the entities or by a set of authorised entities in the system to verify the compliance of the behaviour of various participating entities with the rules of the system. It is often stated that confidentiality, integrity and authentication are the primary interests of cryptology. It is evident from the literature that authentication mechanisms employ confidentiality and integrity services to achieve their goal. Therefore, the fundamental services that any cryptographic algorithm may provide are confidentiality and integrity only. Since controlling the behaviour of the entities is not a feasible cryptologic goal, the verification of the confidentiality of any data is a futile cryptologic exercise. For example, there exists no cryptologic mechanism that would prevent an entity from willingly or unwillingly exposing its private key corresponding to a certified public key. The confidentiality of the data can only be assumed. Therefore, any verification in cryptologic protocols must take the form of integrity verification mechanisms. Thus, compliance verification must take the form of integrity verification in cryptologic protocols. A definition of compliance that is conducive as a cryptologic goal is presented as a guarantee on the confidentiality and integrity services. The definitions are employed to provide a classification mechanism for various message formats in a cryptologic protocol. The classification assists in the characterisation of protocols, which in turn provides a focus for the goals of the research. The resulting concrete goal of the research is the study of those protocols that employ message formats to provide restricted confidentiality and universal integrity services to selected data. The thesis proposes an informal technique to understand, analyse and synthesise the integrity goals of a protocol system. The thesis contains a study of key recovery, electronic cash, peer-review, electronic auction, and electronic voting protocols. All these protocols contain message formats that provide restricted confidentiality and universal integrity services to selected data. The study of key recovery systems aims to achieve robust key recovery relying only on the certification procedure and without the need for tamper-resistant system modules. The result of this study is a new technique for the design of key recovery systems called hybrid key escrow. The thesis identifies a class of compliant cryptologic protocols called secure selection protocols (SSP). The uniqueness of this class of protocols is the similarity in the goals of the member protocols, namely peer-review, electronic auction and electronic voting.
The problem statements describing the goals of these protocols contain a tuple, (I, D), where I usually refers to the identity of a participant and D usually refers to the data selected by the participant. SSP are interested in providing a confidentiality service to the tuple, to hide the relationship between I and D, and an integrity service to the tuple after its formation, to prevent modification of the tuple. The thesis provides a schema to solve instances of SSP by employing electronic cash technology. The thesis makes a distinction between electronic cash technology and electronic payment technology. It treats electronic cash technology as a certification mechanism that allows participants to obtain a certificate on their public key without revealing the certificate or the public key to the certifier; a sketch of this idea appears below. The thesis abstracts the certificate and the public key as a data structure called an anonymous token. It proposes design schemes for the peer-review, e-auction and e-voting protocols by employing the schema with the anonymous token abstraction. The thesis concludes by providing a variety of problem statements for future research that would further enrich the literature.
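One well-known mechanism with exactly this property is a Chaum-style RSA blind signature, sketched below with toy parameters. We offer it as an illustration of the e-cash-style certification idea the thesis abstracts as an "anonymous token"; it is not necessarily the thesis's exact construction.

```python
# Chaum-style RSA blind signature: the certifier signs a value without seeing
# it. Toy textbook RSA parameters; never use keys this small in practice.
# Requires Python 3.8+ for pow(x, -1, n) modular inverse.
n, e, d = 3233, 17, 2753           # modulus 61*53 with public/private exponents
m = 1234                           # e.g. a hash of the requester's public key
r = 7                              # blinding factor, gcd(r, n) == 1

blinded = (m * pow(r, e, n)) % n               # requester -> certifier
signed_blinded = pow(blinded, d, n)            # certifier signs blindly
token = (signed_blinded * pow(r, -1, n)) % n   # requester unblinds

assert token == pow(m, d, n)       # identical to a direct signature on m
assert pow(token, e, n) == m       # anyone can verify the token against m
```

Because the certifier only ever sees the blinded value, it cannot later link the verified token back to the certification request, which is what hides the relationship between I and D.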
Abstract:
Queensland University of Technology (QUT) completed an Australian National Data Service (ANDS) funded “Seeding the Commons Project” to contribute metadata to Research Data Australia. The project employed two Research Data Librarians from October 2009 through to July 2010. Technical support for the project was provided by QUT’s High Performance Computing and Research Support Specialists.

The project identified and described QUT’s category 1 (ARC / NHMRC) research datasets. Metadata for the research datasets was stored in QUT’s Research Data Repository (Architecta Mediaflux). Metadata which was suitable for inclusion in Research Data Australia was made available to the Australian Research Data Commons (ARDC) in RIF-CS format.

Several workflows and processes were developed during the project. 195 data interviews took place in connection with 424 separate research activities, resulting in the identification of 492 datasets.

The project had a high level of technical support from QUT’s High Performance Computing and Research Support Specialists, who developed the Research Data Librarian interface to the data repository that enabled manual entry of interview data and dataset metadata and the creation of relationships between repository objects. The Research Data Librarians mapped the QUT metadata repository fields to RIF-CS, and an application was created by the HPC and Research Support Specialists to generate RIF-CS files for harvest by the ARDC.

This poster will focus on the workflows and processes established for the project, including:

• Interview processes and instruments
• Data ingest from existing systems (including mapping to RIF-CS)
• Data entry and the Data Librarian interface to Mediaflux
• Verification processes
• Mapping and creation of RIF-CS for the ARDC (see the sketch below)
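For readers unfamiliar with RIF-CS, a minimal collection record of the kind harvested by the ARDC can be generated with the standard library as below. The key, group, originatingSource and descriptive values are placeholders of our own; the project's actual application mapped many more repository fields.

```python
# Hedged sketch: emit a minimal RIF-CS collection record for ARDC harvest.
import xml.etree.ElementTree as ET

NS = "http://ands.org.au/standards/rif-cs/registryObjects"
ET.register_namespace("", NS)

root = ET.Element(f"{{{NS}}}registryObjects")
obj = ET.SubElement(root, f"{{{NS}}}registryObject",
                    group="Queensland University of Technology")
ET.SubElement(obj, f"{{{NS}}}key").text = "qut.edu.au/dataset/example-0001"  # placeholder
ET.SubElement(obj, f"{{{NS}}}originatingSource").text = "https://www.qut.edu.au/"
coll = ET.SubElement(obj, f"{{{NS}}}collection", type="dataset")
name = ET.SubElement(coll, f"{{{NS}}}name", type="primary")
ET.SubElement(name, f"{{{NS}}}namePart").text = "Example research dataset"
ET.SubElement(coll, f"{{{NS}}}description", type="brief").text = (
    "Dataset described during a data interview.")

print(ET.tostring(root, encoding="unicode"))
```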