730 results for pacs: it trainings requirements


Relevance: 20.00%
Abstract:

Purpose – Building project management requires a real-time flow of information between all the project team members, or supply chain members. In the present scenario, when project participants are geographically separated, adoption of Information and Communication Technology (ICT) enables such effective communication. Strategic adoption of ICT, however, requires that all the supply chain members follow accepted methods of communication, or communication protocols. The majority of construction organizations are small and medium enterprises (SMEs). This research therefore focuses on developing IT-enhanced communication protocols for building project management by SMEs.
Design/methodology/approach – The research adopts a sequential mixed-methods approach, in which data collection and analysis are conducted in both the quantitative and qualitative phases of the research.
Findings – The protocols are proposed as a “Strategic Model for Enhancing ICT Diffusion in Building Projects”. The framework for the model is discussed at three levels of study, i.e. industry, organization, and people.
Practical implications – While the research was conducted in an Indian context, the outcome is envisaged to be widely applicable in other countries, with due considerations.
Originality/value – The developed framework has implications for national-level bodies, academic institutions, organizations, and people or project managers, and is applicable at the international level after due considerations.

Relevance: 20.00%
Abstract:

Berridge's model (e.g. [Berridge KC. Food reward: Brain substrates of wanting and liking. Neurosci Biobehav Rev 1996;20:1–25; Berridge KC, Robinson TE. Parsing reward. Trends Neurosci 2003;26:507–513; Berridge KC. Motivation concepts in behavioral neuroscience. Physiol Behav 2004;81:179–209]) outlines the brain substrates thought to mediate food reward, with distinct ‘liking’ (hedonic/affective) and ‘wanting’ (incentive salience/motivation) components. Understanding the dual aspects of food reward could throw light on food choice, appetite control and overconsumption. The present study reports the development of a procedure to measure these processes in humans. A computer-based paradigm was used to assess ‘liking’ (through pleasantness ratings) and ‘wanting’ (through a forced-choice photographic procedure) for foods that varied in fat (high or low) and taste (savoury or sweet). Sixty participants completed the program when hungry and after an ad libitum meal. Findings indicate a state (hungry–satiated)-dependent, partial dissociation between ‘liking’ and ‘wanting’ for generic food categories. In the hungry state, participants ‘wanted’ high-fat savoury > low-fat savoury with no corresponding difference in ‘liking’, and ‘liked’ high-fat sweet > low-fat sweet but did not differ in ‘wanting’ for these foods. In the satiated state, participants ‘liked’, but did not ‘want’, high-fat savoury > low-fat savoury, and ‘wanted’ but did not ‘like’ low-fat sweet > high-fat sweet. More differences in ‘liking’ and ‘wanting’ were observed when hungry than when satiated. This procedure provides a first step in proof of concept that ‘liking’ and ‘wanting’ can be dissociated in humans, and can be further developed for foods varying along different dimensions. Other experimental procedures may also be devised to separate ‘liking’ and ‘wanting’.

Relevance: 20.00%
Abstract:

Dentists have the privilege of possessing, administering and prescribing drugs, including highly addictive medications, to their patients. But because drugs are vulnerable to abuse by all members of society, including dentists and their patients, and because drugs can be dangerous, they are tightly regulated in Canada by the federal and provincial/territorial governments. Regulatory and professional dental bodies also provide guidance for their members about how best to administer and prescribe drugs. This chapter outlines the regulation by federal and provincial/territorial governments in this area, examines the professional practice requirements set out by regulatory/professional bodies, and considers the issue of drug abuse by dental professionals and patients. It is important to note from the outset that governmental and professional regulations, policies and practices differ from province to province and territory to territory. This chapter aims to alert dentists to possible legal and professional issues surrounding the possession, administration and prescription of drugs. For detailed, specific information about regulation, policies, ethical standards and professional practice standards in Canada or their province/territory, dentists should contact their insurer or professional association.

Relevance: 20.00%
Abstract:

We introduce the concept of attribute-based authenticated key exchange (AB-AKE) within the framework of ciphertext policy attribute-based systems. A notion of AKE-security for AB-AKE is presented based on the security models for group key exchange protocols, also taking into account the security requirements generally considered in the ciphertext policy attribute-based setting. We also extend the paradigm of hybrid encryption to ciphertext policy attribute-based encryption schemes. A new primitive called encapsulation policy attribute-based key encapsulation mechanism (EP-AB-KEM) is introduced and a notion of chosen ciphertext security is defined for EP-AB-KEMs. We propose an EP-AB-KEM from an existing attribute-based encryption scheme and show that it achieves chosen ciphertext security in the generic group and random oracle models. We present a generic one-round AB-AKE protocol that satisfies our AKE-security notion. The protocol is generically constructed from any EP-AB-KEM that satisfies chosen ciphertext security. Instantiating the generic AB-AKE protocol with our EP-AB-KEM results in a concrete one-round AB-AKE protocol that is also secure in the generic group and random oracle models.
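The one-round, KEM-based exchange pattern the abstract describes can be illustrated with a toy sketch. Here a plain Diffie-Hellman KEM stands in for the paper's EP-AB-KEM (the attribute-policy machinery is omitted entirely), and the session-key derivation, names and parameters are illustrative assumptions; the prime is far too small to be secure.

```python
import hashlib
import secrets

# Toy DH-based KEM standing in for the paper's EP-AB-KEM (hypothetical
# stand-in; the attribute policy is omitted). Demo parameters, NOT secure.
P = 2**127 - 1   # small Mersenne prime
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    return sk, pow(G, sk, P)                      # (private, public)

def encap(pk):
    """Encapsulate a fresh key under pk; return (ciphertext, key)."""
    r = secrets.randbelow(P - 2) + 1
    k = hashlib.sha256(str(pow(pk, r, P)).encode()).digest()
    return pow(G, r, P), k

def decap(sk, ct):
    return hashlib.sha256(str(pow(ct, sk, P)).encode()).digest()

# One round: each party encapsulates to the other; the session key
# hashes both encapsulated keys in a fixed order (A->B key first).
sk_a, pk_a = keygen()
sk_b, pk_b = keygen()
ct_ab, k_ab = encap(pk_b)                         # A -> B
ct_ba, k_ba = encap(pk_a)                         # B -> A, same round

key_a = hashlib.sha256(k_ab + decap(sk_a, ct_ba)).hexdigest()
key_b = hashlib.sha256(decap(sk_b, ct_ab) + k_ba).hexdigest()
assert key_a == key_b                             # both derive one session key
```

Because neither message depends on the other, the two encapsulations can be sent simultaneously, which is what makes a one-round protocol possible from any chosen-ciphertext-secure KEM.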

Relevance: 20.00%
Abstract:

In this paper we argue for an experientially grounded view of IT professionals’ ethical formation and support. We propose that for such formation and support to be effectual, it should challenge professionals’ conceptualisations of their field and of ethics, and it should do so with the aim of changing their experience. To this end, we present a Model of Ethical IT, which is based on an examination of the nature of ethics and on empirical findings concerning IT professionals’ experience of ethics. We argue that for IT professionals to be enabled to become more ethical in their practice: the purpose of IT must be primarily understood to be user-oriented; the nature of professional ethics must be primarily understood to be other-centred; and the goal of ethics education must be understood as primarily promoting a change in awareness.

Relevance: 20.00%
Abstract:

What happens when the traditional framing mechanisms of our performance environments are removed and we are forced as directors to work with actors in digital environments that capture performance in 360 degrees? As directors contend with the challenges of interactive performance, the emergence of the online audience and the powerful influence of the games industry, how can we approach the challenges of directing work that is performance-captured and presented in real time using motion capture and associated 3D imaging software? The 360-degree real-time capture of performance, while allowing for an unlimited amount of framing potential, demands a unique and uncompromisingly disciplined style of direction and performance that has thus far remained unstudied and unquantified. Through a close analysis of the groundbreaking work of artists like Robert Zemeckis and the Weta Digital studio it is possible to begin to quantify what the technical requirements and challenges of 360-degree direction might be, but little has been discovered about the challenges of communicating the unlimited potential of framing and focus to the actors who work with these directors within these systems. It can be argued that the potential of theatrical space has evolved beyond the physical and moved into a more accessible virtual and digitised form, so how then can we direct for this unlimited potential, and where do we place the focus of our directed (and captured) performance?

Relevance: 20.00%
Abstract:

Hollywood has dominated the global film business since the First World War. Economic formulas used by governments to assess levels of industry dominance typically measure market share to establish the degree of industry concentration. The business literature reveals that a marketing orientation strongly correlates with superior market performance and that market leaders that possess a set of six superior marketing capabilities are able to continually outperform rival firms. This paper argues that the historical evidence shows that the Hollywood Majors have consistently outperformed rival firms and rival film industries in each of those six marketing capabilities and that unless rivals develop a similarly integrated and cohesive strategic marketing management approach to the movie business and match the Major studios’ superior capabilities, then Hollywood’s dominance will continue. This paper also proposes that in cyberspace, whilst the Internet does provide a channel that democratises film distribution, the flat landscape of the world wide web means that in order to stand out from the clutter of millions of cyber-voices seeking attention, independent film companies need to possess superior strategic marketing management capabilities and develop effective e-marketing strategies to find a niche, attract a loyal online audience and prosper. However, mirroring a recent CIA report forecasting a multi-polar world economy, this paper also argues that potentially serious longer-term rivals are emerging and will increasingly take a larger slice of an expanding global box office as India, China and other major developing economies and their respective cultural channels grow and achieve economic parity with or surpass the advanced western economies. Thus, in terms of global market share over time, Hollywood’s slice of the pie will comparatively diminish in an emerging multi-polar movie business.

Relevance: 20.00%
Abstract:

An Asset Management (AM) life cycle constitutes a set of processes that align with the development, operation and maintenance of assets, in order to meet the desired requirements and objectives of the stakeholders of the business. The scope of AM is often broad within an organisation due to the interactions between its internal elements such as human resources, finance, technology, engineering operation, information technology and management, as well as external elements such as governance and environment. Due to the complexity of AM processes, it has been proposed that, in order to optimise asset management activities, process modelling initiatives should be adopted. Although organisations adopt AM principles and carry out AM initiatives, most do not document or model their AM processes, let alone enact their processes (semi-)automatically using a computer-supported system. There is currently a lack of knowledge describing how to model AM processes in a methodical and suitable manner so that the processes are streamlined and optimised and are ready for deployment in a computerised way. This research aims to overcome this deficiency by developing an approach that will aid organisations in constructing AM process models quickly and systematically whilst using the most appropriate techniques, such as workflow technology. Currently, there is a wealth of information within the individual domains of AM and workflow. Both fields are gaining significant popularity in many industries, thus fuelling the need for research exploring the possible benefits of their cross-disciplinary application. This research therefore investigates these two domains to exploit the application of workflow to the modelling and execution of AM processes. Specifically, it investigates appropriate methodologies for applying workflow techniques to AM frameworks.
One of the benefits of applying workflow models to AM processes is the ability to accommodate both ad-hoc and evolutionary changes over time. In addition, this can automate an AM process as well as support the coordination and collaboration of the people involved in carrying out the process. A workflow management system (WFMS) can be used to support the design and enactment (i.e. execution) of processes and to cope with changes that occur to a process during enactment. So far, little literature documents a systematic approach to modelling the characteristics of AM processes. In order to obtain a workflow model for AM processes, commonalities and differences between different AM processes need to be identified. This is the fundamental step in developing a sound workflow model for AM processes. Therefore, the first stage of this research focuses on identifying the characteristics of AM processes, especially AM decision-making processes. The second stage is to review a number of contemporary workflow techniques and choose a suitable technique for application to AM decision-making processes. The third stage is to develop an intermediate, ameliorated AM decision process definition that improves the current process description and is ready for modelling using the workflow language selected in the previous stage. All these lead to the fourth stage, where a workflow model for an AM decision-making process is developed. The process model is then deployed (semi-)automatically in a state-of-the-art WFMS, demonstrating the benefits of applying workflow technology to the domain of AM. Given that the information in the AM decision-making process is captured at an abstract level within the scope of this work, the deployed process model can be used as an executable guideline for carrying out an AM decision process in practice.
Moreover, it can be used as a vanilla system that, once enriched with information from a specific AM decision-making process (e.g. building construction or power plant maintenance), is able to support the automation of such a process in a more elaborate way.
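The core enactment idea in this abstract, a process model walked task by task by a workflow engine, can be sketched minimally. The task names and process graph below are hypothetical, not the thesis's AM decision model:

```python
from collections import deque

# Minimal workflow-enactment sketch: a process definition is a graph of
# tasks, and enactment walks the graph, invoking a handler per task.
# Task names are illustrative, not drawn from the thesis.
WORKFLOW = {                     # task -> successor tasks
    "assess_condition": ["evaluate_options"],
    "evaluate_options": ["select_action"],
    "select_action": ["schedule_work"],
    "schedule_work": [],
}

def enact(workflow, start, handlers=None):
    """Walk the process graph from `start`, running a handler per task."""
    handlers = handlers or {}
    trace, queue = [], deque([start])
    while queue:
        task = queue.popleft()
        handlers.get(task, lambda: None)()        # enact the task
        trace.append(task)
        queue.extend(workflow[task])              # schedule successors
    return trace

trace = enact(WORKFLOW, "assess_condition")
# trace records the tasks in enactment order
```

Keeping the process definition as data, separate from the handlers, is what lets the same engine accommodate both ad-hoc and evolutionary change: the graph can be edited without touching the enactment logic.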

Relevance: 20.00%
Abstract:

In this research I have examined how ePortfolios can be designed for postgraduate music study through a practice-led research enquiry. This process involved designing two Web 2.0 ePortfolio systems for a group of five postgraduate music research students. The design process revolved around the application of an iterative methodology called Software Development as Research (SoDaR) that seeks to simultaneously develop design and pedagogy. The approach to designing these ePortfolio systems applied four theoretical protocols to examine the use of digitised artefacts in ePortfolio systems, to enable a dynamic and inclusive dialogue around representations of the students' work. The research and design process involved an analysis of existing software and literature, with a focus upon identifying the affordances of available Web 2.0 software and the applications of these ideas within 21st-century life. The five postgraduate music students each posed different needs in relation to the management of digitised artefacts and the communication of their work amongst peers and supervisors and for public display. An ePortfolio was developed for each of them that was flexible enough to address their needs within the university setting. However, in this first SoDaR iteration's data-gathering phase I identified aspects of the university context that presented a negative case, which impacted upon the design and usage of the ePortfolios and prevented uptake. Whilst the portfolio itself functioned effectively, the university policies and technical requirements prevented serious use. The negative case analysis revealed that the Access and Control and the Implementation, Technical and Policy Constraints protocols were limiting user uptake.
From the semi-structured interviews carried out as part of this study, participant feedback revealed that whilst the participants did not use the ePortfolio system I designed, each student was employing Web 2.0 social networking and storage processes in their lives and research. In the subsequent iterations I then designed a more ‘ideal’ system that could be applied outside of the university context and that draws upon the employment of these resources. In conclusion I suggest recommendations about ePortfolio design that consider what the applications of the theoretical protocols reveal about creative arts settings. The transferability of these recommendations is of course dependent upon the reapplication of the theoretical protocols in a new context. To address the mobility of ePortfolio design between institutions and wider settings I have also designed a prototype for a business-card-sized USB portal for the artist's ePortfolio. This research project is not a static one; it stands as an evolving design for a Web 2.0 ePortfolio that seeks to respond to users' needs, institutional and professional contexts, and the development of software that can be incorporated within the design. What it potentially provides to creative artists is an opportunity to have a dialogue about art, with artefacts of the artist's products and processes in that discussion.

Relevance: 20.00%
Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on the computation, transmission, and storage costs. This decomposition structure is based on analysis of information packing performance of several decompositions, two-dimensional power spectral density, effect of each frequency band on the reconstructed image, and the human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criteria as well as the sensitivities in human vision. To design an efficient quantization algorithm, a precise model for distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, the lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters known as truncation level and scaling factor. In lattice-based compression algorithms reported in the literature the lattice structure is commonly predetermined leading to a nonoptimized quantization approach. 
In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made and no training and multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all of the source vectors without the need to project these on the lattice outermost shell, while it properly maintains a small codebook size. It also resolves the wedge region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only the cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training and multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. 
To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented. The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structural-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local ridge dominant directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
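One step the abstract names, fitting the shape parameter of a generalized Gaussian model to wavelet coefficients, can be sketched with simple moment matching rather than the thesis's least-squares formulation. The ratio E|X| / sqrt(E[X^2]) is a monotone function of the shape parameter, so bisection recovers it; the sample data and bracketing interval are assumptions for the sketch.

```python
import math
import random

# Moment-matching sketch for the generalized Gaussian shape parameter,
# a simple stand-in for the thesis's least-squares fit.
def gg_ratio(beta):
    """E|X| / sqrt(E[X^2]) for a generalized Gaussian with shape beta."""
    return math.gamma(2 / beta) / math.sqrt(
        math.gamma(1 / beta) * math.gamma(3 / beta))

def estimate_shape(samples, lo=0.2, hi=10.0, iters=60):
    m1 = sum(abs(x) for x in samples) / len(samples)   # sample E|X|
    m2 = sum(x * x for x in samples) / len(samples)    # sample E[X^2]
    target = m1 / math.sqrt(m2)
    for _ in range(iters):                             # bisection on beta
        mid = 0.5 * (lo + hi)
        if gg_ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(1)
coeffs = [random.gauss(0.0, 1.0) for _ in range(20000)]
beta = estimate_shape(coeffs)   # typically near 2 for Gaussian data
```

For a Laplacian source the same estimator lands near 1, which is why this single parameter is enough to adapt the quantizer per subband.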

Relevance: 20.00%
Abstract:

Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment are important in allocating scarce resources and making sound decisions. Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision-making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular, flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. 
This process divides the infrastructure management process over time into self-contained modules that are based on a particular set of activities, with the information flows between them defined by their interfaces and relationships. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, through using individual infrastructure assets and their related projects as the basis of the network analysis process. It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises then weights objectives, using a paired comparison process, ensures that the objectives required to be met are both kept to the minimum number required and fairly weighted. Qualitative variables are incorporated into the weighting and scoring process, with utility functions proposed where there is risk or a trade-off situation applies. Variability is considered important in the infrastructure life cycle, the approach used being based on analytical principles but incorporating randomness in variables where required. The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided boundary conditions and requirements for linkages to other modules are met. 
Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects, and consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the requirement to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
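The paired-comparison weighting step described above can be sketched as follows. The objective names and the simple win-counting rule are illustrative assumptions, not the thesis's exact procedure:

```python
# Paired-comparison weighting sketch: each objective is compared against
# every other once, and its weight is its share of the comparison wins.
def paired_weights(objectives, prefer):
    """prefer(a, b) -> True when objective a is preferred over b."""
    wins = {o: 0 for o in objectives}
    for i, a in enumerate(objectives):
        for b in objectives[i + 1:]:
            wins[a if prefer(a, b) else b] += 1
    total = sum(wins.values())            # n * (n - 1) / 2 comparisons
    return {o: wins[o] / total for o in objectives}

# A strict priority order stands in for the manager's judgments here.
order = ["safety", "service_level", "cost", "environment"]
weights = paired_weights(order, lambda a, b: order.index(a) < order.index(b))
# safety wins all 3 of its comparisons -> weight 3/6
```

In practice the `prefer` judgments would come from the decision maker rather than a fixed ranking, and a lowest-ranked objective scoring zero is one reason weighting schemes often add a base score to every objective.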

Relevance: 20.00%
Abstract:

This dissertation develops the model of a prototype system for the digital lodgement of spatial data sets with statutory bodies responsible for the registration and approval of land related actions under the Torrens Title system. Spatial data pertain to the location of geographical entities together with their spatial dimensions and are classified as point, line, area or surface. This dissertation deals with a sub-set of spatial data, land boundary data that result from the activities performed by surveying and mapping organisations for the development of land parcels. The prototype system has been developed, utilising an event-driven paradigm for the user-interface, to exploit the potential of digital spatial data being generated from the utilisation of electronic techniques. The system provides for the creation of a digital model of the cadastral network and dependent data sets for an area of interest from hard copy records. This initial model is calibrated on registered control and updated by field survey to produce an amended model. The field-calibrated model then is electronically validated to ensure it complies with standards of format and content. The prototype system was designed specifically to create a database of land boundary data for subsequent retrieval by land professionals for surveying, mapping and related activities. Data extracted from this database are utilised for subsequent field survey operations without the need to create an initial digital model of an area of interest. Statistical reporting of differences resulting when subsequent initial and calibrated models are compared, replaces the traditional checking operations of spatial data performed by a land registry office. Digital lodgement of survey data is fundamental to the creation of the database of accurate land boundary data. 
This creation of the database is fundamental also to the efficient integration of accurate spatial data about land being generated by modern technology, such as global positioning systems and remote sensing and imaging, with land boundary information and other information held in Government databases. The prototype system developed provides for the delivery of accurate, digital land boundary data for the land registration process to ensure the continued maintenance of the integrity of the cadastre. Such data should also meet the more general and encompassing requirements of, and prove to be of tangible, longer-term benefit to, the developing electronic land information industry.
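The electronic validation step the abstract mentions can be illustrated with a basic closure check on a parcel traverse. The bearing/distance representation and the tolerance are assumptions for this sketch, not the prototype's actual validation rules:

```python
import math

# Closure check for a parcel boundary: sum the easting/northing deltas
# of each boundary line (bearing in decimal degrees, distance in metres).
# A closed figure should return a near-zero misclosure.
def misclosure(lines):
    d_e = sum(d * math.sin(math.radians(brg)) for brg, d in lines)
    d_n = sum(d * math.cos(math.radians(brg)) for brg, d in lines)
    return math.hypot(d_e, d_n)

# A 20 m square parcel: bearings N, E, S, W.
square = [(0.0, 20.0), (90.0, 20.0), (180.0, 20.0), (270.0, 20.0)]
assert misclosure(square) < 1e-9     # closes (floating-point noise only)
assert misclosure(square[:2]) > 1.0  # an open traverse fails the check
```

A real validator would compare the misclosure against a distance-dependent statutory tolerance and also check format and content rules, but the arithmetic core is this coordinate-delta summation.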

Relevance: 20.00%
Abstract:

This thesis addresses the contemporary issue of the control, restoration and potential for reuse of State Government-owned heritage properties with commercial potential. It attempts to reconcile the sometimes competing interests of the range of stakeholders in such properties, particularly those seeking to maximise economic performance and return on one hand and community expectations for heritage preservation and exhibition on the other. The matters are approached principally from the Government's position as asset owner/manager. It includes research into a number of key elements - including statutory, physical and economic parameters and an analysis of the legitimate requirements of all stakeholders. The thesis also recognises the need for innovation in approach and for the careful structuring and pre-planning of proposals on a project-by-project basis. On the matter of innovation, four case studies are included in the thesis to exhibit some approaches and techniques that have already been employed in addressing these issues. From this research base, a series of deductions at both a macro and micro level are established and a model for a rational decision-making process for dealing with such projects is developed as a major outcome of the work. Finally, the general model is applied to a specific project, the currently unused Port Office heritage site in the Brisbane Central Business District.