915 results for Data replication processes


Relevance:

30.00%

Publisher:

Abstract:

In this reflection on research processes, a humanities researcher begins to ask questions about the cultural materialist dimensions of research activities. At the center of this exploration are questions relating to the ways in which personal histories and experiences inform particular research processes, and the ways in which a researcher's habits of collecting and working with data are regulated by cultural and social practice. The reflection on personal research processes is situated in relation to the ethics work of Michel Foucault, which provides reminders about the role of modern bureaucracy in governing what appear to be personal processes.

Relevance:

30.00%

Publisher:

Abstract:

Assurance of learning is a predominant feature in both quality enhancement and quality assurance in higher education. It is a process that articulates explicit program outcomes and standards, and systematically gathers evidence to determine the extent to which performance matches expectations. Benefits accrue to the institution through the systematic assessment of whole-of-program goals. Data may be used for continuous improvement, for program development, and to inform external accreditation and evaluation bodies. Recent developments, including the introduction of the Tertiary Education and Quality Standards Agency (TEQSA), will require universities to review the methods they use to assure learning outcomes. This project investigates two critical elements of assurance of learning: (1) the mapping of graduate attributes throughout a program; and (2) the collection of assurance of learning data. An audit was conducted with 25 of the 39 Business Schools in Australian universities to identify current methods for mapping graduate attributes and for collecting assurance of learning data across degree programs, together with a review of the key challenges faced in these areas. Our findings indicate that external drivers such as professional body accreditation (for example, by the Association to Advance Collegiate Schools of Business (AACSB)) and TEQSA are important motivators for assuring learning, and that schools undertaking AACSB accreditation had more robust assurance of learning systems in place. It was reassuring to see that the majority of institutions (96%) had adopted an embedding approach to assuring learning rather than opting for independent standardised testing. The main challenges evident were developing sustainable processes that are not considered a burden by academic staff, and obtaining academic buy-in to the benefits of assuring learning per se, rather than assurance of learning being seen as a tick-box exercise. This cultural change is the real challenge in assurance of learning practice.

Relevance:

30.00%

Publisher:

Abstract:

Boundaries are an important field of study because they mediate almost every aspect of organizational life. They are becoming increasingly important as organizations change more frequently and yet, despite the endemic use of the boundary metaphor in common organizational parlance, they are poorly understood. Organizational boundaries are under-theorized and researchers in related fields often simply assume their existence, without defining them. The literature on organizational boundaries is fragmented with no unifying theoretical basis. As a result, when it is recognized that an organizational boundary is "dysfunctional", there is little recourse to models on which to base remediating action. This research sets out to develop just such a theoretical model and is guided by the general question: "What is the nature of organizational boundaries?" It is argued that organizational boundaries can be conceptualised through elements of both social structure and social process. Elements of structure include objects, coupling, properties and identity. Social processes include objectification, identification, interaction and emergence. All of these elements are integrated by a core category, or basic social process, called boundary weaving. An organizational boundary is a complex system of objects and emergent properties that are woven together by people as they interact, objectifying the world around them, identifying with these objects and creating couplings of varying strength and polarity as well as their own fragmented identity. Organizational boundaries are characterised by the multiplicity of interconnections, a particular domain of objects, varying levels of embodiment and patterns of interaction. The theory developed in this research emerged from an exploratory, qualitative research design employing grounded theory methodology. The field data was collected from the training headquarters of the New Zealand Army using semi-structured interviews and follow-up observations. The unit of analysis is an organizational boundary. Only one research context was used because of the richness and multiplicity of organizational boundaries that were present. The model arose, grounded in the data collected, through a process of theoretical memoing and constant comparative analysis. Academic literature was used as a source of data to aid theory development and the saturation of some central categories. The final theory is classified as middle range, being substantive rather than formal, and is generalizable across medium to large organizations in low-context societies. The main limitation of the research arose from its breadth, with multiple lines of inquiry spanning several academic disciplines, and some relevant areas, such as the role of identity and complexity, addressed at a necessarily high level. The organizational boundary theory developed by this research replaces the typology approaches typical of previous theory on organizational boundaries and reconceptualises the nature of groups in organizations as well as the role of "boundary spanners". It also has implications for any theory that relies on the concept of boundaries, such as general systems theory. The main contribution of this research is the development of a holistic model of organizational boundaries, including an explanation of the multiplicity of boundaries: no organization has a single definable boundary. A significant aspect of this contribution is the integration of aspects of complexity theory and identity theory to explain the emergence of higher-order properties of organizational boundaries and of organizational identity. The core category of "boundary weaving" is a powerful new metaphor that significantly reconceptualises the way organizational boundaries may be understood in organizations. It invokes secondary metaphors such as the weaving of an organization's "boundary fabric" and provides managers with other metaphorical perspectives, such as the management of boundary friction, boundary tension, boundary permeability and boundary stability. Opportunities for future research reside in formalising and testing the theory as well as developing analytical tools that would enable managers in organizations to apply the theory in practice.

Relevance:

30.00%

Publisher:

Abstract:

Modelling an environmental process involves creating a model structure and parameterising the model with appropriate values to accurately represent the process. Determining accurate parameter values for environmental systems can be challenging. Existing methods for parameter estimation typically make assumptions regarding the form of the likelihood, and will often ignore any uncertainty around estimated values. This can be problematic, particularly in complex problems where likelihoods may be intractable. In this paper we demonstrate an Approximate Bayesian Computation (ABC) method for the estimation of parameters of a stochastic cellular automaton (CA). As an example we use a CA constructed to simulate a range expansion such as might occur after a biological invasion, making parameter estimates using only count data such as could be gathered from field observations. We demonstrate that ABC is a highly useful method for parameter estimation, producing accurate estimates of parameters that are important for the management of invasive species, such as the intrinsic rate of increase and the point in a landscape where a species has invaded. We also show that the method is capable of estimating the probability of long-distance dispersal, a characteristic of biological invasions that is very influential in determining spread rates but has until now proved difficult to estimate accurately.
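
To make the general approach concrete, here is a minimal sketch of ABC rejection sampling applied to a toy lattice spread model with count-data summaries. The model structure, priors, tolerance and parameter names below are illustrative assumptions, not the authors' implementation.

```python
# ABC rejection sketch for a toy stochastic range-expansion CA.
# All modelling choices here are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)
GRID, STEPS = 50, 20

def simulate_counts(r, p_ldd):
    """Simulate a toy range expansion on a lattice; return the
    occupied-cell count after each step (the 'count data')."""
    occ = np.zeros((GRID, GRID), dtype=bool)
    occ[GRID // 2, GRID // 2] = True          # single introduction point
    counts = []
    for _ in range(STEPS):
        # local spread: occupied cells colonise 4-neighbours w.p. r
        nbr = np.zeros_like(occ)
        nbr[1:, :] |= occ[:-1, :]; nbr[:-1, :] |= occ[1:, :]
        nbr[:, 1:] |= occ[:, :-1]; nbr[:, :-1] |= occ[:, 1:]
        occ |= nbr & (rng.random(occ.shape) < r)
        # rare long-distance dispersal: a jump to a random cell
        if rng.random() < p_ldd:
            occ[rng.integers(GRID), rng.integers(GRID)] = True
        counts.append(occ.sum())
    return np.asarray(counts)

# 'observed' field counts, generated here from known parameters
observed = simulate_counts(r=0.3, p_ldd=0.1)

accepted = []
for _ in range(5000):
    r = rng.uniform(0.0, 1.0)        # flat prior: local spread rate
    p_ldd = rng.uniform(0.0, 0.5)    # flat prior: long-distance dispersal
    sim = simulate_counts(r, p_ldd)
    # ABC rejection: keep draws whose simulated counts match observation
    if np.linalg.norm(sim - observed) / np.linalg.norm(observed) < 0.1:
        accepted.append((r, p_ldd))

post = np.asarray(accepted)
print(f"accepted {len(accepted)} of 5000 draws")
if len(accepted):
    print(f"posterior mean r     = {post[:, 0].mean():.3f}")
    print(f"posterior mean p_ldd = {post[:, 1].mean():.3f}")
```

The key design point, as in the paper, is that no likelihood is ever evaluated: candidate parameters are kept or discarded purely on how closely simulated count data match the observed counts.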

Relevance:

30.00%

Publisher:

Abstract:

Muscle physiologists often describe fatigue simply as a decline of muscle force and infer that this causes an athlete to slow down. In contrast, exercise scientists describe fatigue during sport competition more holistically as an exercise-induced impairment of performance. The aim of this review is to reconcile the different views by evaluating the many performance symptoms/measures and mechanisms of fatigue. We describe how fatigue is assessed with muscle, exercise or competition performance measures. Muscle performance (single muscle test measures) declines due to peripheral fatigue (reduced muscle cell force) and/or central fatigue (reduced motor drive from the CNS). Peak muscle force seldom falls by more than 30% during sport, but the loss is often greater during electrical stimulation and laboratory exercise tasks. Exercise performance (whole-body exercise test measures) reveals impaired physical/technical abilities and subjective fatigue sensations. Exercise intensity is initially sustained by recruitment of new motor units and help from synergistic muscles before it declines. Technique/motor-skill execution deviates as exercise proceeds in order to maintain outcomes before they deteriorate, e.g. through reduced accuracy or velocity. The sensation of fatigue incorporates an elevated rating of perceived exertion (RPE) during submaximal tasks, due to a combination of peripheral and higher CNS inputs. Competition performance (sport symptoms) is affected more by decision-making and psychological aspects, since there are opponents and greater importance is placed on the result. Laboratory-based decision-making is generally unimpaired or even faster. Motivation, self-efficacy and anxiety can change during exercise to modify RPE and, hence, alter physical performance. Symptoms of fatigue during racing, team-game or racquet sports are largely anecdotal, but sometimes assessed with time-motion analysis. Fatigue during brief all-out racing is described biomechanically as a decline of peak velocity, along with altered kinematic components. Longer sport events involve pacing strategies, central and peripheral fatigue contributions and elevated RPE. During match play, the work rate can decline late in a match (or tournament) and/or transiently after intense exercise bursts. Repeated-sprint ability, agility and leg strength become slightly impaired. Technique outcomes, such as velocity and accuracy for throwing, passing, hitting and kicking, can deteriorate. Physical and subjective changes are both less severe in real rather than in simulated sport activities. Little objective evidence exists to support exercise-induced mental lapses during sport. A model depicting mind-body interactions during sport competition shows that the RPE centre-motor cortex-working muscle sequence drives overall performance levels and, hence, fatigue symptoms. The sporting outputs from this sequence can be modulated by interactions with muscle afferent and circulatory feedback, and with psychological and decision-making inputs. Importantly, compensatory processes exist at many levels to protect against performance decrements. Small changes of putative fatigue factors can also be protective. We show that individual fatigue factors, including diminished carbohydrate availability, elevated serotonin, hypoxia, acidosis, hyperkalaemia, hyperthermia, dehydration and reactive oxygen species, each contribute to several fatigue symptoms. Thus, multiple symptoms of fatigue can occur simultaneously, and the underlying mechanisms overlap and interact. Based on this understanding, we reinforce the proposal that fatigue is best described globally as an exercise-induced decline of performance, as this is inclusive of all viewpoints.

Relevance:

30.00%

Publisher:

Abstract:

Road asset managers are overwhelmed with a high volume of raw data which they need to process and utilise to support their decision-making. This paper presents a method that processes the road-crash data of a whole road network and exposes hidden value inherent in the data by deploying the clustering data-mining method. The goal of the method is to partition the road network into a set of groups (classes) based on common data, and to characterise each class's crash types to produce a crash profile for each cluster. By comparing similar road classes that have differing crash types and rates, insight can be gained into how these differences arise from the particular characteristics of the roads. These differences can be used as evidence in knowledge development and decision support.
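
A hypothetical sketch of the clustering step described above: road segments are grouped by shared characteristics, then a crash-type profile is built per cluster. The feature names, crash types and cluster count are invented for illustration, not taken from the paper.

```python
# Partition a synthetic road network into classes and profile crashes.
# Column names, crash categories and k=4 are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 200
roads = pd.DataFrame({
    "aadt": rng.lognormal(8, 1, n),           # annual average daily traffic
    "speed_limit": rng.choice([60, 80, 100], n),
    "lane_count": rng.integers(1, 4, n),
    "curvature": rng.random(n),
})
crash_type = pd.Series(
    rng.choice(["head-on", "rear-end", "run-off-road"], n), name="crash_type")

# partition the network into classes of roads with common characteristics
X = StandardScaler().fit_transform(roads)
roads["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# crash profile: distribution of crash types within each road class
profile = pd.crosstab(roads["cluster"], crash_type, normalize="index").round(2)
print(profile)
```

Comparing rows of the resulting profile table is the step the paper describes: road classes with similar characteristics but dissimilar crash profiles point to road-specific causes worth investigating.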

Relevance:

30.00%

Publisher:

Abstract:

Mandatory data breach notification laws are a novel and potentially important legal instrument for the organisational protection of personal information. These laws require organisations that have suffered a data breach involving personal information to notify the persons who may be affected, and potentially government authorities, about the breach. The Australian Law Reform Commission (ALRC) has proposed the creation of a mandatory data breach notification scheme, implemented via amendments to the Privacy Act 1988 (Cth). However, the conceptual differences between data breach notification law and information privacy law are such that it is questionable whether a data breach notification scheme can be implemented solely via an information privacy law. Accordingly, this thesis by publications investigated, through six journal articles, the extent to which data breach notification law is conceptually and operationally compatible with information privacy law. The assessment of compatibility began with the identification of key issues related to data breach notification law. The first article, Stakeholder Perspectives Regarding the Mandatory Notification of Australian Data Breaches, started this stage of the research, which concluded in the second article, The Mandatory Notification of Data Breaches: Issues Arising for Australian and EU Legal Developments ('Mandatory Notification'). A key issue that emerged was whether data breach notification was itself an information privacy issue. This notion guided the remaining research and focused attention on the next stage: an examination of the conceptual and operational foundations of both laws. The second article, Mandatory Notification, and the third article, Encryption Safe Harbours and Data Breach Notification Laws, did so from the perspective of data breach notification law. The fourth article, The Conceptual Basis of Personal Information in Australian Privacy Law, and the fifth article, Privacy Invasive Geo-Mashups: Privacy 2.0 and the Limits of First Generation Information Privacy Laws, did so for information privacy law. The final article, Contextualizing the Tensions and Weaknesses of Information Privacy and Data Breach Notification Laws, synthesised the previous research findings within the framework of contextualisation principally developed by Nissenbaum. The examination of conceptual and operational foundations revealed tensions between the two laws and weaknesses shared by both. First, the distinction between sectoral and comprehensive information privacy legal regimes was important, as it shaped the development of US data breach notification laws and their subsequently implementable scope in other jurisdictions. Second, the sectoral versus comprehensive distinction produced different emphases in relation to data breach notification, leading to different forms of remedy; the prime example is the distinction between the market-based initiatives found in US data breach notification laws and the rights-based protections found in the EU and Australia. Third, both laws are predicated on the regulation of personal information exchange processes, even though they regulate this process from different perspectives, namely a context-independent or a context-dependent approach. Fourth, both laws have limited notions of harm that are further constrained by restrictive accountability frameworks. The findings of the research suggest that data breach notification is more compatible with information privacy law in some respects than in others. Apparent compatibilities clearly exist, as both laws have an interest in the protection of personal information. However, this thesis revealed that the ostensible similarities are founded on some significant differences. Data breach notification law is either a comprehensive facet of a sectoral approach or a sectoral adjunct to a comprehensive regime. Whilst there are fundamental differences between the two laws, they are not so great as to make them incompatible with each other. The similarities between both laws are sufficient to forge compatibilities, but it is likely that the distinctions between them will produce anomalies, particularly if both laws are applied from a perspective that negates contextualisation.

Relevance:

30.00%

Publisher:

Abstract:

The health system is one sector dealing with a deluge of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, and many still have stand-alone systems that are not integrated for information management and decision-making. This shows there is a need for an effective system to capture, collate and distribute health data; implementing the data warehouse concept in healthcare is therefore potentially one solution for integrating health data. Data warehousing has been used to support business intelligence and decision-making in many other sectors, such as engineering, defence and retail. The research problem addressed is: "How can data warehousing assist the decision-making process in healthcare?" To address this problem the researcher narrowed the investigation to focus on a cardiac surgery unit, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as the case study. The cardiac surgery unit at TPCH uses a stand-alone database of patient clinical data, which supports clinical audit, service management and research functions. However, much of the time, interaction between the cardiac surgery unit information system and other units is minimal, with only limited and basic two-way interaction with the other clinical and administrative databases at TPCH that support decision-making processes. The aims of this research are to investigate what decision-making issues are faced by healthcare professionals with the current information systems, and how decision-making might be improved within this healthcare setting by implementing an aligned data warehouse model or models. As part of the research, the researcher proposed and developed a suitable data warehouse prototype based on the cardiac surgery unit's needs, integrating the Intensive Care Unit database, the Clinical Costing unit database (Transition II) and the Quality and Safety unit database [electronic discharge summary (e-DS)], with the goal of improving current decision-making processes. The main objectives of this research are to improve access to integrated clinical and financial data, potentially providing better information for decision-making. Drawing on both the questionnaire responses and the literature, the results indicate a centralised data warehouse model for the cardiac surgery unit at this stage. A centralised data warehouse model addresses current needs and can also be upgraded to an enterprise-wide warehouse model or a federated data warehouse model, as discussed in the many consulted publications. The data warehouse prototype was developed using SAS enterprise data integration studio 4.2 and the data was analysed using SAS enterprise edition 4.3. In the final stage, the data warehouse prototype was evaluated by collecting feedback from the end users. This was achieved by using output created from the data warehouse prototype as examples of the data desired and possible in a data warehouse environment. According to the feedback collected from the end users, implementation of a data warehouse was seen to be a useful tool to inform management options, provide a more complete representation of factors related to a decision scenario and potentially reduce information product development time. However, many constraints existed in this research: technical issues such as data incompatibilities and the integration of the cardiac surgery database and e-DS database servers; Queensland Health information restrictions (information-related policies, patient data confidentiality and ethics requirements); limited availability of support from IT technical staff; and time restrictions. These factors influenced the process of warehouse model development, necessitating an incremental approach, and highlight the presence of many practical barriers to data warehousing and integration at the clinical service level. Limitations included the use of a small convenience sample of survey respondents, and a single-site case report study design. As mentioned previously, the proposed data warehouse is a prototype and was developed using only four database repositories. Despite this constraint, the research demonstrates that by implementing a data warehouse at the service level, decision-making is supported and data quality issues related to access and availability can be reduced, providing many benefits. Output reports produced from the data warehouse prototype demonstrated usefulness for the improvement of decision-making in the management of clinical services, and for quality and safety monitoring for better clinical care. In the future, the centralised model selected can be upgraded to an enterprise-wide architecture by integrating additional hospital units' databases.
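
As a hypothetical illustration of the integration step described above (the thesis used SAS tooling, and all table and column names here are invented), unit-level extracts can be conformed around a shared admission key to yield a single analysis-ready fact table:

```python
# Toy sketch: conform three stand-alone unit extracts around a shared
# admission key so clinical, costing and discharge data can be queried
# together. Names and values are invented, not from the thesis.
import pandas as pd

icu = pd.DataFrame({"admission_id": [1, 2], "icu_hours": [36, 12]})
costing = pd.DataFrame({"admission_id": [1, 2], "total_cost": [18200, 9400]})
eds = pd.DataFrame({"admission_id": [1, 2], "discharge_dx": ["CABG", "AVR"]})

fact = (icu.merge(costing, on="admission_id", how="outer")
           .merge(eds, on="admission_id", how="outer"))
print(fact)
```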

Relevance:

30.00%

Publisher:

Abstract:

Several authors stress that data provides a crucial foundation for operational, tactical and strategic decisions (e.g., Redman 1998, Tee et al. 2007). Data provides the basis for decision making, as data collection and processing are typically associated with reducing uncertainty in order to make more effective decisions (Daft and Lengel 1986). While the first series of Information Systems/Information Technology (IS/IT) investments in organizations improved data collection, restricted computational capacity and limited processing power created challenges (Simon 1960). Fifty years on, capacity and processing problems are increasingly less relevant; in fact, the opposite problem exists. Determining data relevance and usefulness is complicated by increased data capture and storage capacity, as well as by continual improvements in information processing capability. As the IT landscape changes, businesses are inundated with ever-increasing volumes of data from both internal and external sources, available on both an ad-hoc and a real-time basis. More data, however, does not necessarily translate into more effective and efficient organizations, nor does it increase the likelihood of better or timelier decisions. This raises questions about what managers require of data to assist their decision-making processes.

Relevance:

30.00%

Publisher:

Abstract:

Flow-oriented process modeling languages have a long tradition in the area of Business Process Management and are widely used for capturing activities along with their behavioral and data dependencies. Individual events were introduced for triggering process instantiation and activities. However, real-world business cases drive the need to also cover complex event patterns, as known from the field of Complex Event Processing. Therefore, this paper puts forward a catalog of requirements for handling complex events in process models, which can be used as a reference framework for assessing process definition languages and systems. An assessment of BPEL and BPMN is provided.
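
To illustrate what a "complex event pattern" means operationally, here is a toy matcher (not from the paper) for one such pattern: process instantiation fires only when two correlated events arrive within a time window. The event names, window length and API are invented.

```python
# Toy conjunction pattern: instantiate a process when both an order
# and a payment event for the same case arrive within WINDOW seconds.
from collections import defaultdict

WINDOW = 60.0  # seconds; assumed pattern window

class ConjunctionPattern:
    """Fire a callback once all required event types have been seen
    for the same correlation key within WINDOW seconds."""
    def __init__(self, required, on_match):
        self.required = set(required)
        self.on_match = on_match
        self.pending = defaultdict(dict)   # key -> {event_type: timestamp}

    def feed(self, key, event_type, ts):
        seen = self.pending[key]
        seen[event_type] = ts
        # discard events that have fallen out of the time window
        self.pending[key] = {e: t for e, t in seen.items() if ts - t <= WINDOW}
        if self.required <= self.pending[key].keys():
            self.on_match(key)
            del self.pending[key]

matcher = ConjunctionPattern(
    {"order_received", "payment_received"},
    on_match=lambda case: print(f"instantiate process for case {case}"))
matcher.feed("case-42", "order_received", 0.0)
matcher.feed("case-42", "payment_received", 12.5)  # both in window: fires
```

Conjunction with a time window is only one entry in such a requirements catalog; patterns like sequence, negation and aggregation over event streams follow the same feed-and-match structure.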

Relevance:

30.00%

Publisher:

Abstract:

Cities accumulate and distribute vast sets of digital information. Many decision-making and planning processes in councils, local governments and organisations are based on both real-time and historical data. Until recently, only a small, carefully selected subset of this information was released to the public, usually for specific purposes (e.g. train timetables or the release of planning applications through websites, to name just a few). This situation is, however, changing rapidly. Regulatory frameworks, such as the Freedom of Information legislation in the US, the UK, the European Union and many other countries, guarantee public access to data held by the state. One result of this legislation and of changing attitudes towards open data has been the widespread release of public information as part of recent Government 2.0 initiatives. This includes the creation of public data catalogues such as data.gov (US), data.gov.uk (UK) and data.gov.au (Australia) at federal government levels, and datasf.org (San Francisco) and data.london.gov.uk (London) at municipal levels. The release of this data has opened up the possibility of a wide range of future applications and services which are now the subject of intensified research efforts. Previous research endeavours have explored the creation of specialised tools to aid decision-making by urban citizens, councils and other stakeholders (Calabrese, Kloeckl & Ratti, 2008; Paulos, Honicky & Hooker, 2009). While these initiatives represent an important step towards open data, they too often result in mere collections of data repositories. Proprietary database formats and the lack of an open application programming interface (API) limit the full potential that could be achieved by allowing these data sets to be cross-queried. Our research, presented in this paper, looks beyond the pure release of data. It is concerned with three essential questions: First, how can data from different sources be integrated into a consistent framework and made accessible? Second, how can ordinary citizens be supported in easily composing data from different sources in order to address their specific problems? Third, what interfaces make it easy for citizens to interact with, access and collect data in an urban environment?
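
A toy illustration of the cross-querying the paper argues for: once two open data sets share a consistent tabular format, joining them is trivial. The data sets, fields and values below are invented for illustration.

```python
# Cross-query two hypothetical open data extracts once they share a
# consistent format. CSV contents and field names are invented.
import io
import pandas as pd

stops = io.StringIO("stop_id,suburb\n101,Kelvin Grove\n102,Newstead\n")
usage = io.StringIO("stop_id,daily_boardings\n101,1450\n102,380\n")

merged = pd.read_csv(stops).merge(pd.read_csv(usage), on="stop_id")
print(merged.sort_values("daily_boardings", ascending=False))
```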

Relevance:

30.00%

Publisher:

Abstract:

The study shows an alternative solution to existing efforts at solving the problem of how to centrally manage and synchronise users' Multiple Profiles (MP) across multiple discrete social networks. Most social network users hold more than one social network account and utilise them in different ways depending on the digital context (Iannella, 2009a). They may, for example, enjoy friendly chat on Facebook, professional discussion on LinkedIn, and health information exchange on PatientsLikeMe. Therefore many web users need to manage disparate profiles across many distributed online sources. Maintaining these profiles is cumbersome, time-consuming, inefficient, and may lead to lost opportunity. In this thesis the researcher proposes a framework for the management of a user's multiple online social network profiles. A demonstrator, called the Multiple Profile Manager (MPM), is showcased to illustrate how effective the framework can be. The MPM achieves the required profile management and synchronisation using a free, open, decentralised social networking platform (OSW) that was proposed by the Vodafone Group in 2010. The proposed MPM enables a user to create and manage an integrated profile (IP) and share/synchronise this profile with all their social networks. The necessary protocols to support the prototype are also proposed by the researcher. The MPM protocol specification defines an Extensible Messaging and Presence Protocol (XMPP) extension for sharing vCard and social network account information between the MPM Server, the MPM Client, and social network sites (SNSs). The writer of this thesis adopted a research approach and a number of use cases for the implementation of the project. The use cases were created to capture the functional requirements of the MPM and to describe the interactions between users and the MPM. In the research, a development process was followed in establishing the prototype and related protocols. The use cases were subsequently used to illustrate the prototype via screenshots taken of the MPM client interfaces. The use cases also played a role in evaluating the outcomes of the research, such as the framework, the prototype, and the related protocols. An innovative application of this project is in the area of public health informatics. The researcher utilised the prototype to examine how the framework might benefit patients and physicians. The framework can greatly enhance health information management for patients and, more importantly, offer physicians a more comprehensive personal health overview of patients. This will give a more complete picture of the patient's background than is currently available and will prove helpful in providing the right treatment. The MPM prototype and related protocols have high application value as they can be integrated into the real OSW platform and so serve users in the modern digital world. They also provide online users with a real platform for centrally storing their complete profile data, efficiently managing their personal information, and, moreover, synchronising the overall complete profile with each of their discrete profiles stored in their different social network sites.
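
The thesis's actual protocol definition is not reproduced here, but as a purely hypothetical illustration of what an XMPP extension stanza for pushing a vCard update to a social network site could look like (the element names, namespace and addresses are all invented), consider:

```python
# Build a hypothetical XMPP IQ stanza for profile synchronisation.
# Namespace, element names and JIDs are invented for illustration;
# this is not the MPM protocol's actual wire format.
import xml.etree.ElementTree as ET

iq = ET.Element("iq", {"type": "set", "from": "alice@mpm.example",
                       "to": "sns.example", "id": "profile-sync-1"})
profile = ET.SubElement(iq, "profile",
                        {"xmlns": "urn:example:mpm:profile:0"})
vcard = ET.SubElement(profile, "vCard")
ET.SubElement(vcard, "FN").text = "Alice Example"
ET.SubElement(vcard, "EMAIL").text = "alice@example.org"
# which of the user's social network accounts this update targets
account = ET.SubElement(profile, "account", {"site": "sns.example"})
account.text = "alice_sns"

print(ET.tostring(iq, encoding="unicode"))
```

The design idea this mirrors is the one in the thesis: a single integrated profile held centrally, with per-site updates expressed as ordinary XMPP IQ traffic so existing XMPP servers can route them.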

Relevance:

30.00%

Publisher:

Abstract:

Ocean processes are complex and have high variability in both time and space. Thus, ocean scientists must collect data over long time periods to obtain a synoptic view of ocean processes and resolve their spatiotemporal variability. One way to perform these persistent observations is to utilise an autonomous vehicle that can remain on deployment for long time periods. However, such vehicles are generally underactuated and slow moving. A challenge for persistent monitoring with these vehicles is dealing with currents while executing a prescribed path or mission. Here we present a path planning method for persistent monitoring that exploits ocean currents to increase navigational accuracy and reduce energy consumption.
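
One simple way to realise current-aware planning is graph search with flow-dependent edge costs. The following sketch is an assumption-laden toy, not the authors' method: a Dijkstra search on a grid where moving with the local current costs less energy than pushing against it; the current field, cost model and grid are invented.

```python
# Toy current-aware planner: Dijkstra on a grid with edge costs that
# reward riding the current. Field, cost model and grid are assumed.
import heapq
import math

N = 30          # grid size (assumed)
SPEED = 1.0     # vehicle through-water speed, m/s (assumed)

def current(x, y):
    """Toy eastward current that strengthens toward the middle rows."""
    return (0.4 * math.sin(math.pi * y / (N - 1)), 0.0)   # (u, v) in m/s

def plan(start, goal):
    dist, parent = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == goal:
            break
        if d > dist[(x, y)]:
            continue
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if not (0 <= nx < N and 0 <= ny < N):
                continue
            u, v = current(x, y)
            along = u * dx + v * dy         # current component along move
            cost = max(0.1, SPEED - along)  # riding the flow is cheaper
            nd = d + cost
            if nd < dist.get((nx, ny), float("inf")):
                dist[(nx, ny)], parent[(nx, ny)] = nd, (x, y)
                heapq.heappush(pq, (nd, (nx, ny)))
    path, node = [goal], goal               # walk back to recover the path
    while node != start:
        node = parent[node]
        path.append(node)
    return path[::-1], dist[goal]

route, energy = plan((0, 0), (N - 1, N - 1))
print(f"{len(route)} waypoints, energy proxy {energy:.1f}")
```

For a slow, underactuated vehicle the payoff is exactly the one the abstract names: routes that detour through favourable flow can cost less energy, and drift along the path is reduced, than the geometrically shortest route.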

Relevance:

30.00%

Publisher:

Abstract:

Most approaches to business process compliance are restricted to the analysis of the structure of processes. It has been argued that full regulatory compliance requires information not only on the structure of processes but also on what the tasks in a process do. To this end, Governatori and Sadiq [2007] proposed to extend business processes with semantic annotations. We propose a methodology to automatically extract one kind of such annotation, in particular the annotations related to the data schemas and templates linked to the various tasks in a business process.
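
An illustrative sketch of the idea (not the paper's algorithm): given the form template linked to each task, derive a data-schema annotation describing what data the task handles. Task names and template fields are invented.

```python
# Derive a per-task data-schema annotation from linked form templates.
# Tasks, fields and the annotation shape are invented for illustration.
from typing import Dict, List

templates: Dict[str, List[dict]] = {   # hypothetical claims process
    "Lodge claim": [
        {"field": "claimant_name", "type": "string", "required": True},
        {"field": "incident_date", "type": "date", "required": True},
    ],
    "Assess claim": [
        {"field": "assessor_id", "type": "string", "required": True},
        {"field": "payout_amount", "type": "decimal", "required": False},
    ],
}

def annotate(task: str) -> dict:
    """Build a semantic annotation describing the data a task handles."""
    fields = templates[task]
    return {
        "task": task,
        "schema": {f["field"]: f["type"] for f in fields},
        "mandatory": [f["field"] for f in fields if f["required"]],
    }

for task in templates:
    print(annotate(task))
```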

Relevance:

30.00%

Publisher:

Abstract:

The quadrupole coupling constants (qcc) for 39K and 23Na ions in glycerol have been calculated from linewidths measured as a function of temperature (which in turn results in changes in solution viscosity). The qcc of 39K in glycerol is found to be 1.7 MHz, and that of 23Na is 1.6 MHz. The relaxation behavior of 39K and 23Na ions in glycerol shows magnetic field and temperature dependence consistent with the equations for transverse relaxation more commonly used to describe the reorientation of nuclei in a molecular framework with intramolecular field gradients. It is shown, however, that τc is not simply proportional to the ratio of viscosity to temperature (η/T). The 39K qcc in glycerol and the value of 1.3 MHz estimated for this nucleus in aqueous solution are much greater than the values of 0.075 to 0.12 MHz calculated from T2 measurements of 39K in freshly excised rat tissues. This indicates that, in biological samples, processes such as exchange of potassium between intracellular compartments or diffusion of ions through locally ordered regions play a significant role in determining the effective quadrupole coupling constant and correlation time governing 39K relaxation. T1 and T2 measurements of rat muscle at two magnetic fields also indicate that a more complex correlation function may be required to describe the relaxation of 39K in tissue. Similar results and conclusions are found for 23Na.
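
For reference, the standard textbook relations behind these statements (not reproduced from the paper itself) are the extreme-narrowing quadrupolar relaxation rate and the Stokes-Einstein-Debye estimate of the correlation time that motivates the η/T comparison:

```latex
% Standard extreme-narrowing quadrupolar relaxation rate (Abragam)
% for a spin-I nucleus; for 39K and 23Na, I = 3/2.
\[
  \frac{1}{T_1} = \frac{1}{T_2}
  = \frac{3}{40}\,\frac{2I+3}{I^2(2I-1)}
    \left(1+\frac{\eta_Q^{2}}{3}\right)
    \left(\frac{e^{2} qQ}{\hbar}\right)^{2} \tau_c ,
\]
% where e^2 qQ / h is the quadrupole coupling constant (qcc) and
% \eta_Q the asymmetry parameter of the electric field gradient.
% Stokes--Einstein--Debye reorientation of a sphere of radius r:
\[
  \tau_c = \frac{4\pi \eta r^{3}}{3 k_B T} \;\propto\; \frac{\eta}{T}.
\]
```

The paper's finding that τc does not scale simply with η/T, and that tissue values of the effective qcc fall far below the glycerol and aqueous values, is what signals that exchange and restricted diffusion, rather than simple reorientation, dominate relaxation in vivo.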