922 results for requirement for consent discontinuance


Relevance:

10.00%

Publisher:

Abstract:

Human survival depends on human ingenuity in using resources at hand to sustain human life. The historical record – in writings and archaeological artefacts – provides evidence of the growth and collapse of political organisations and societies. In the institutions of Western civilisation, some traditions have endured over millennia where the roles of monarchs and public officials have been organised in perpetual succession. These roles were developed as conventions in the British Parliament after 1295 and provided the models of corporate governance in both public and private enterprise that have been continuously refined to the present day. In 2011, the Queensland Parliament legislated to introduce a new and more open system of scrutiny of legislation through a system of portfolio-based parliamentary committees. The committees began to function more actively in July 2012 and have been inviting submissions from stakeholders and experts in a structured way to consider the government’s priorities in its legislative programme. The questions now are whether the Surveying and Spatial Sciences profession can respond expertly to the terms of reference and meet the timetables of the various parliamentary committees. This paper discusses some of the more important and urgent issues that deserve debate and that the profession needs to address in becoming more responsive to matters of public policy.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, cities have shown increasing signs of environmental problems due to the negative impacts of urban activities. The degradation and depletion of natural resources, climate change, and development pressure on green areas have become major concerns for cities. In response to these problems, urban planning policies have shifted towards sustainability, and authorities have begun to develop new strategies for improving the quality of urban ecosystems. An extremely important function of an urban ecosystem is to provide healthy and sustainable environments for both natural systems and communities. Therefore, ecological planning is a functional requirement in the establishment of a sustainable built environment. With ecological planning, human needs are met while natural resources are used in the most effective and sustainable manner and ecological balance is sustained. Protecting human and environmental health, maintaining healthy ecosystems, reducing environmental pollution and providing green spaces are just a few of the many benefits of ecological planning. In this context, this chapter presents a brief overview of the importance of implementing ecological planning in sustainable urban development. Furthermore, it presents a conceptual framework for a new methodology for developing sustainable urban ecosystems through an ecological planning approach.

Relevance:

10.00%

Publisher:

Abstract:

In Australia, Vocational Education and Training (VET) programs are delivered in a variety of settings. A student can be enrolled in a course at a high school, a technical institution, a private training provider or their place of employment. Recognition of prior learning, on-the-job training and industry partnerships are strong factors supporting the change in delivery. The curriculum content within these programs has also changed. For example, within the Business Services programs, the prerequisite and corequisite skill of touch keyboarding to an Australian Standard has moved from a core requirement in the 1990s to an elective requirement in the 2000s. Where a base skill becomes an elective skill, how does this affect the performance and outcomes for the learner, educator, employer and society as a whole? This paper explores these issues and investigates the current position of standards within the VET curriculum today.

Relevance:

10.00%

Publisher:

Abstract:

We consider a joint relay selection and subcarrier allocation problem that minimizes the total system power for a multi-user, multi-relay, single-source cooperative OFDM-based two-hop system. The system is constrained so that each user has a specific subcarrier requirement (user fairness); however, no specific fairness constraints for relays are considered. To ensure optimum power allocation, the subcarriers in the two hops are paired with each other. We obtain an optimal subcarrier allocation for the single-user case using a method similar to that described in [1] and modify the algorithm for the multiuser scenario. Although optimality is not achieved in the multiuser case, the probability of all users being served fairly is improved significantly at a relatively low cost.
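
As a concrete illustration of the kind of pairing-and-allocation heuristic involved, the sketch below pairs first-hop and second-hop subcarriers by sorted channel gain and then greedily assigns the cheapest pairs to users until each per-user subcarrier requirement is met. The variable names, channel model and power expression are assumptions made for illustration; this is not the authors' algorithm nor the method of reference [1].

    # Illustrative sketch only: greedy subcarrier pairing and allocation for a
    # two-hop OFDM relay system. Channel gains g1/g2 and the per-user subcarrier
    # requirements are hypothetical inputs.
    import numpy as np

    def pair_and_allocate(g1, g2, requirements, rate=1.0, noise=1.0):
        """Pair hop-1 and hop-2 subcarriers by sorted gain, then greedily assign
        pairs to users until each user's subcarrier requirement is met."""
        snr_target = 2 ** rate - 1                     # SNR needed for the fixed per-subcarrier rate
        first_hop = np.argsort(g1)[::-1]               # strongest hop-1 subcarriers first
        second_hop = np.argsort(g2)[::-1]              # strongest hop-2 subcarriers first
        pairs = list(zip(first_hop, second_hop))

        def pair_power(i, j):                          # power to carry the rate on both hops
            return snr_target * noise / g1[i] + snr_target * noise / g2[j]

        remaining = dict(requirements)                 # user -> subcarriers still needed
        order = sorted(range(len(pairs)), key=lambda k: pair_power(*pairs[k]))
        assignment, total_power = {}, 0.0
        for k in order:
            if not any(v > 0 for v in remaining.values()):
                break
            user = max(remaining, key=remaining.get)   # user with the largest unmet need
            assignment[pairs[k]] = user
            total_power += pair_power(*pairs[k])
            remaining[user] -= 1
        return assignment, total_power

    # Example: 8 subcarriers per hop, two users each requiring 3 subcarriers.
    rng = np.random.default_rng(0)
    g1, g2 = rng.exponential(1.0, 8), rng.exponential(1.0, 8)
    print(pair_and_allocate(g1, g2, {"user_a": 3, "user_b": 3}))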

Relevance:

10.00%

Publisher:

Abstract:

eHealth systems promise enviable benefits and capabilities for healthcare, but the technologies that make these capabilities possible bring with them undesirable drawbacks, such as information security threats, which need to be appropriately addressed. Lurking within these threats are patient privacy concerns. Addressing these privacy concerns has proven difficult, since they often conflict with the information requirements of care providers. It is important to achieve a proper balance between these requirements. We believe that information accountability can achieve this balance. In this paper we introduce accountable-eHealth (AeH) systems. We discuss how our designed protocols can successfully address the aforementioned requirements. We also compare the characteristics of AeH systems with Australia’s PCEHR system, identify similarities, and highlight the differences and the impact those differences would have on the eHealth domain.

Relevance:

10.00%

Publisher:

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as the Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the units from performing synchronised actions such as triggering cameras or intricate swarm manoeuvres. In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision. A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements. The BabelFuse system comprises a single Grand Master unit which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE 1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of the Slaves from the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below the target of 1 ms.
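
For context, the clock offset that such a synchronisation scheme corrects is normally derived from a two-way timestamp exchange, as in IEEE 1588 PTP. The sketch below shows that standard offset/delay calculation with invented timestamp values and an assumed symmetric path delay; it is not the BabelFuse firmware protocol itself.

    # Standard IEEE 1588 (PTP) offset/delay calculation; timestamps are invented.
    def ptp_offset_and_delay(t1, t2, t3, t4):
        """t1: Sync sent by master, t2: Sync received by slave,
        t3: Delay_Req sent by slave, t4: Delay_Req received by master.
        All timestamps in seconds; a symmetric path delay is assumed."""
        offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
        delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay estimate
        return offset, delay

    # Example: a slave clock running 2.5 microseconds ahead of the master
    # over a link with a 40 microsecond one-way delay.
    offset, delay = ptp_offset_and_delay(t1=0.0, t2=42.5e-6, t3=100.0e-6, t4=137.5e-6)
    print(f"offset = {offset * 1e6:.1f} us, delay = {delay * 1e6:.1f} us")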

Relevance:

10.00%

Publisher:

Abstract:

Critically ill patients receiving extracorporeal membrane oxygenation (ECMO) are often noted to have increased sedation requirements; however, data related to sedation in this complex group of patients are limited. The aim of our study was to characterise the sedation requirements of adult patients receiving ECMO for cardiorespiratory failure. A retrospective chart review was performed to collect sedation data for 30 consecutive patients who received venovenous or venoarterial ECMO between April 2009 and March 2011. To test for a difference in doses over time we used a regression model. The dose of midazolam received on ECMO support increased by an average of 18 mg per day (95% confidence interval 8, 29 mg, P=0.001), while the dose of morphine increased by 29 mg per day (95% confidence interval 4, 53 mg, P=0.021). The venovenous group received a daily midazolam dose that was 157 mg higher than the venoarterial group (95% confidence interval 53, 261 mg, P=0.005). We did not observe any significant increase in fentanyl doses over time (95% confidence interval 1269, 4337 µg, P=0.94). There was a significant increase in the dose requirement for morphine and midazolam during ECMO, and patients on venovenous ECMO received higher sedative doses than patients on venoarterial ECMO. Future research should focus on the mechanisms behind these changes and identify the drugs most suitable for sedation during ECMO.
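
The dose trends reported above are slopes from a regression of daily dose on ECMO day. As a minimal sketch of that kind of estimate (with invented daily doses, not the study's chart-review data), an ordinary least-squares line can be fitted as follows:

    # Estimate the change in daily dose over time by fitting dose = a + b*day.
    # The data points below are hypothetical.
    import numpy as np

    def daily_dose_slope(days, doses):
        slope, intercept = np.polyfit(days, doses, 1)   # coefficients: slope first
        return slope

    days = np.array([1, 2, 3, 4, 5, 6, 7])
    midazolam_mg = np.array([60, 85, 95, 120, 130, 155, 170])
    print(f"estimated increase: {daily_dose_slope(days, midazolam_mg):.1f} mg/day")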

Relevance:

10.00%

Publisher:

Abstract:

Recovery is a highly contextualized concept amid divergent interpretations and unique experiences. There is substantial current interest in building evidence about recovery from mental illness in order to inform best practice founded in the ways people find to live productive and meaningful lives. This paper presents some accounts related to recovery and illness expressed by eight people through a Participatory Action Research project. The research facilitated entry into the subjective experiences of living in the community as an artist with a mental illness. The people in the research shared an integrated understanding of illness, recovery and identity. Their understanding provided insight into mental illness as an inseparable aspect of who they were. Further, a specific issue was raised about recovery as a clinical term carrying a requirement to meet distinct conventions of recovery. This paper emphasizes that being ill and being well, for the person with a mental illness, is a dynamic and complex development not easily explained or transformed into uniform processes or outcomes. Attempts to establish an integral or consensual approach to recovery have, to date, disregarded mental illness as a full human experience. This paper argues that broader frameworks for thinking about and responding to the dynamic processes of mental illness and recovery are needed, and that they require acknowledgment of competing and contradictory ideas.

Relevance:

10.00%

Publisher:

Abstract:

Background: Despite important implications for the budgets, statistical power and generalisability of research findings, detailed reports of recruitment and retention in randomised controlled trials (RCTs) are rare. The NOURISH RCT evaluated a community-based intervention for first-time mothers that promoted protective infant feeding practices as a primary prevention strategy for childhood obesity. The aim of this paper is to provide a detailed description and evaluation of the recruitment and retention strategies used. Methods: A two-stage recruitment process designed to provide a consecutive sampling framework was used. First-time mothers delivering healthy term infants were initially approached in the postnatal wards of the major maternity services in two Australian cities for consent to later contact (Stage 1). When infants were about four months old, mothers were re-contacted by mail for enrolment (Stage 2), baseline measurements (Time 1) and subsequent random allocation to the intervention or control condition. Outcomes were assessed at infant ages 14 months (Time 2) and 24 months (Time 3). Results: At Stage 1, 86% of eligible mothers were approached, and of these women, 76% consented to later contact. At Stage 2, 3% had become ineligible and 76% could be re-contacted. Of the latter, 44% consented to full enrolment and were allocated; this represented 21% of mothers screened as eligible at Stage 1. Retention at Time 3 was 78%. Mothers who did not consent or who discontinued the study were younger and less likely to have a university education. Conclusions: The consent and retention rates of our sample of first-time mothers are comparable with or better than those of other similar studies. The recruitment strategy used allowed detailed information on non-consenters to be collected, so selection bias could be estimated. Recommendations for future studies include being able to contact participants via mobile phone (particularly text messaging), offering home visits to reduce participant burden, and considering the use of financial incentives to support participant retention.
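
As a check on how the overall yield follows from the stage rates reported above (assuming the rates compound multiplicatively):

    0.86 (approached) × 0.76 (Stage 1 consent) × 0.97 (still eligible) × 0.76 (re-contacted) × 0.44 (Stage 2 consent) ≈ 0.21

i.e. about 21% of the mothers screened as eligible at Stage 1.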

Relevance:

10.00%

Publisher:

Abstract:

Advanced substation applications, such as synchrophasors and IEC 61850-9-2 sampled value process buses, depend upon highly accurate synchronising signals for correct operation. The IEEE 1588 Precision Time Protocol (PTP) is the recommended means of providing precise timing for future substations. This paper presents a quantitative assessment of PTP reliability using fault tree analysis. Two network topologies are proposed that use grandmaster clocks with dual network connections and take advantage of the Best Master Clock Algorithm (BMCA) from IEEE 1588. The cross-connected grandmaster topology doubles reliability, and the addition of a shared third grandmaster gives a nine-fold improvement over duplicated grandmasters. The performance of BMCA-mediated handover of the grandmaster role during contingencies in the timing system was evaluated experimentally. The 1 µs performance requirement of sampled values and synchrophasors is met, even during network or GPS antenna outages. Slave clocks are shown to synchronise to the backup grandmaster in response to degraded performance or loss of the main grandmaster. Slave disturbances are less than 350 ns, provided the grandmaster reference clocks are not offset from one another. A clear understanding of PTP reliability and the factors that affect availability will encourage the adoption of PTP for substation time synchronisation.
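
To illustrate the flavour of a fault tree calculation for redundant grandmasters (not the paper's actual model, failure data or availability figures), the sketch below evaluates simple AND/OR gates under an assumed per-grandmaster failure probability and independent failures:

    # Toy fault-tree gates; the failure probability p_gm is an assumed value.
    def and_gate(*probs):
        """All inputs must fail (redundant elements), assuming independence."""
        p = 1.0
        for q in probs:
            p *= q
        return p

    def or_gate(*probs):
        """Any input failing fails the gate (series elements)."""
        survive = 1.0
        for q in probs:
            survive *= (1.0 - q)
        return 1.0 - survive

    p_gm = 0.01                                # assumed grandmaster failure probability
    single = p_gm                              # one grandmaster
    dual = and_gate(p_gm, p_gm)                # duplicated grandmasters
    triple = and_gate(p_gm, p_gm, p_gm)        # duplicated plus a shared third grandmaster
    print(single, dual, triple)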

Relevance:

10.00%

Publisher:

Abstract:

In 1993, contrary to the trend towards enterprise bargaining, and despite an employment environment favouring strong managerial prerogative, a small group of employers in the Queensland commercial health and fitness industry sought industrial regulation through an industry-specific award. A range of factors, including increased competition and unscrupulous profiteers damaging the industry’s reputation, triggered this action as a business strategy. The strategic choices of the employer group, which approached a union to initiate a consent award, are the inverse of the behaviours expected under strategic choice theory. This article argues that organizational size, collective employer action, a focus on industry rather than organizational outcomes, and the broader impacts provided by the traditional industrial relations system explain this atypical behaviour.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents an efficient method using a system state sampling technique in Monte Carlo simulation for the reliability evaluation of multi-area power systems at Hierarchical Level One (HLI). System state sampling is one of the common methods used in Monte Carlo simulation; however, CPU time and memory requirements can be a problem with this method. A combination of analytical and Monte Carlo methods, known as the hybrid method and presented in this paper, can enhance the efficiency of the solution. The load model in this study can be incorporated either by sampling or by enumeration; both cases are examined in this paper by applying the methods to the Roy Billinton Test System (RBTS).
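
As a minimal sketch of system state sampling for adequacy assessment (a single-area, generation-only toy example with invented unit data, not the RBTS or the paper's hybrid method), each sample draws the up/down state of every generating unit and checks whether the available capacity covers the load:

    # State-sampling Monte Carlo estimate of loss-of-load probability (LOLP).
    import random

    def lolp_state_sampling(units, load, n_samples=100_000, seed=1):
        """units: list of (capacity_MW, forced_outage_rate) tuples."""
        rng = random.Random(seed)
        loss_count = 0
        for _ in range(n_samples):
            available = sum(cap for cap, forced_outage in units
                            if rng.random() >= forced_outage)
            if available < load:
                loss_count += 1
        return loss_count / n_samples

    # Example: five generating units and a constant 230 MW load.
    units = [(100, 0.02), (100, 0.02), (60, 0.03), (40, 0.05), (40, 0.05)]
    print(lolp_state_sampling(units, load=230))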

Relevance:

10.00%

Publisher:

Abstract:

The ability to estimate asset reliability and the probability of failure is critical to reducing maintenance costs, operation downtime, and safety hazards. Predicting the survival time and the probability of failure in future time is an indispensable requirement in prognostics and asset health management. In traditional reliability models, the lifetime of an asset is estimated using failure event data alone; however, statistically sufficient failure event data are often difficult to attain in real-life situations due to poor data management, effective preventive maintenance, and the small population of identical assets in use. Condition indicators and operating environment indicators are two types of covariate data that are normally obtained in addition to failure event and suspended data. These data contain significant information about the state and health of an asset. Condition indicators reflect the level of degradation of assets, while operating environment indicators accelerate or decelerate the lifetime of assets. When these data are available, an alternative approach to traditional reliability analysis is to model condition indicators and operating environment indicators and their failure-generating mechanisms using a covariate-based hazard model. The literature review indicates that a number of covariate-based hazard models have been developed. All of these existing covariate-based hazard models were developed based on the underlying theory of the Proportional Hazard Model (PHM); however, most of these models have not attracted much attention in the field of machinery prognostics. Moreover, due to the prominence of PHM, attempts at developing alternative models have to some extent been stifled, although a number of alternative models to PHM have been suggested. The existing covariate-based hazard models neglect to fully utilise the three types of asset health information (failure event data (i.e. observed and/or suspended), condition data, and operating environment data) within a single model to obtain more effective hazard and reliability predictions. In addition, current research shows that condition indicators and operating environment indicators have different characteristics and are non-homogeneous covariate data: condition indicators act as response variables (or dependent variables), whereas operating environment indicators act as explanatory variables (or independent variables). However, these non-homogeneous covariate data were modelled in the same way for hazard prediction in the existing covariate-based hazard models. The related and yet more imperative question is how both of these indicators should be effectively modelled and integrated into a covariate-based hazard model. This work presents a new approach to addressing these challenges. The new covariate-based hazard model, termed the Explicit Hazard Model (EHM), explicitly and effectively incorporates all three types of available asset health information into the modelling of hazard and reliability predictions, and also derives the relationship between actual asset health and both condition measurements and operating environment measurements. The theoretical development of the model and its parameter estimation method are demonstrated in this work. EHM assumes that the baseline hazard is a function of both time and the condition indicators.
Condition indicators provide information about the health condition of an asset; they therefore update and reform the baseline hazard of EHM according to the health state of the asset at a given time t. Some examples of condition indicators are the vibration of rotating machinery, the level of metal particles in engine oil analysis, and wear in a component, to name but a few. Operating environment indicators in this model are failure accelerators and/or decelerators that are included in the covariate function of EHM and may increase or decrease the value of the hazard relative to the baseline hazard. These indicators are caused by the environment in which an asset operates and have not been explicitly identified by the condition indicators (e.g. loads, environmental stresses, and other dynamically changing environmental factors). While the effects of the operating environment indicators could be nil in EHM, the condition indicators can still emerge, because they are observed and measured for as long as an asset is operational and has survived. EHM has several advantages over the existing covariate-based hazard models. One is that the model utilises three different sources of asset health data (i.e. population characteristics, condition indicators, and operating environment indicators) to effectively predict hazard and reliability. Another is that EHM explicitly investigates the relationship between the condition and operating environment indicators associated with the hazard of an asset. Furthermore, the proportionality assumption, from which most covariate-based hazard models suffer, does not exist in EHM. According to the sample size of failure/suspension times, EHM is extended into two forms: semi-parametric and non-parametric. The semi-parametric EHM assumes a specified lifetime distribution (i.e. the Weibull distribution) in the form of the baseline hazard. However, in many industry applications, due to sparse failure event data for assets, the analysis of such data often involves complex distributional shapes about which little is known. Therefore, to avoid the restrictive assumption of the semi-parametric EHM of a specified lifetime distribution for failure event histories, the non-parametric EHM, which is a distribution-free model, has been developed. The development of EHM in two forms is another merit of the model. A case study was conducted using laboratory experiment data to validate the practicality of both the semi-parametric and non-parametric EHMs. The performance of the newly developed models is appraised through a comparison of the estimated results of these models with those of the other existing covariate-based hazard models. The comparison results demonstrate that both the semi-parametric and non-parametric EHMs outperform the existing covariate-based hazard models. Future research directions are also identified, including a new parameter estimation method for the case of time-dependent covariate effects and missing data, the application of EHM to both repairable and non-repairable systems using field data, and a decision support model linked to the estimated reliability results.
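
To make the structure of such a covariate-based hazard concrete, the sketch below evaluates a Weibull baseline hazard scaled by an exponential covariate term and integrates it numerically to obtain reliability. This is a deliberately simple PHM-style illustration, not the thesis's Explicit Hazard Model; all parameter values, covariates and the covariate link are invented.

    # Weibull baseline hazard with an exponential covariate link (illustrative only).
    import math

    def hazard(t, condition, environment, beta=1.8, eta=1000.0,
               gamma_cond=0.05, gamma_env=0.3):
        """t: operating time (hours); condition: e.g. a vibration level;
        environment: e.g. a normalised load. Parameter values are made up."""
        baseline = (beta / eta) * (t / eta) ** (beta - 1.0)
        return baseline * math.exp(gamma_cond * condition + gamma_env * environment)

    def reliability(t, condition, environment, steps=1000):
        """R(t) = exp(-cumulative hazard), approximated by a Riemann sum,
        holding the covariates fixed at their current values."""
        dt = t / steps
        cum = sum(hazard((i + 0.5) * dt, condition, environment) for i in range(steps)) * dt
        return math.exp(-cum)

    print(reliability(500.0, condition=4.0, environment=0.8))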

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a graph-based method for weighting medical concepts in documents for the purposes of information retrieval. Medical concepts are extracted from free-text documents using a state-of-the-art technique that maps n-grams to concepts from the SNOMED CT medical ontology. In our graph-based concept representation, concepts are vertices in a graph built from a document, and edges represent associations between concepts. This representation naturally captures dependencies between concepts, an important requirement for interpreting medical text and a feature lacking in bag-of-words representations. We apply existing graph-based term weighting methods to weight medical concepts. Using concepts rather than terms addresses vocabulary mismatch and encapsulates terms belonging to a single medical entity within a single concept. In addition, we further extend previous graph-based approaches by injecting domain knowledge that estimates the importance of a concept within the global medical domain. Retrieval experiments on the TREC Medical Records collection show that our method outperforms both term and concept baselines. More generally, this work provides a means of integrating background knowledge contained in medical ontologies into data-driven information retrieval approaches.
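
As a minimal sketch of the general idea (graph-based weighting of already-extracted concepts), the example below builds a co-occurrence graph over a toy sequence of concept identifiers and weights the vertices with PageRank. The window size, the toy concept IDs and the choice of PageRank are assumptions made for illustration; the paper's extraction pipeline, weighting methods and domain-knowledge injection are not reproduced here.

    # Build a concept co-occurrence graph and weight concepts with PageRank.
    import networkx as nx

    def concept_weights(concept_sequence, window=3):
        """concept_sequence: a document as a list of concept IDs (e.g. SNOMED CT codes).
        Vertices are concepts; edges link concepts co-occurring within `window` positions."""
        graph = nx.Graph()
        graph.add_nodes_from(concept_sequence)
        for i, concept in enumerate(concept_sequence):
            for other in concept_sequence[i + 1:i + window]:
                if other != concept:
                    graph.add_edge(concept, other)
        return nx.pagerank(graph)

    # Example: a short "document" as a sequence of hypothetical SNOMED CT concept IDs.
    doc = ["38341003", "22298006", "38341003", "84114007", "22298006"]
    print(concept_weights(doc))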

Relevance:

10.00%

Publisher:

Abstract:

The convergence of corporate social responsibility (CSR) and corporate governance has an immense impact on the participants in global supply chains. Global buyers and retailers tend to incorporate CSR into all stages of product manufacturing within their supply chains. This incorporation of CSR creates difficulties for small- and medium-sized manufacturing enterprises (SMEs). A lack of capability in standardised CSR practices is an important issue that causes SMEs either to lose the scope to access the global market directly or to serve as subcontractors to large enterprises. This article explores this issue by focusing on Bangladeshi SMEs operating under the CSR requirements of an important global buyer.