421 results for initialisation flaws


Relevance: 10.00%

Publisher:

Abstract:

The paper provides an assessment of the performance of commercial Real Time Kinematic (RTK) systems over longer than recommended inter-station distances. The experiments were set up to test and analyse solutions from the i-MAX, MAX and VRS systems operated over three triangle-shaped network cells with average inter-station distances of 69 km, 118 km and 166 km. The performance characteristics appraised included initialisation success rate, initialisation time, RTK position accuracy and availability, ambiguity resolution risk and RTK integrity risk, in order to provide a wider perspective of the performance of the systems tested.

The results showed that the performance of all network RTK solutions assessed was affected to a similar degree by the increase in inter-station distance. The MAX solution achieved the highest initialisation success rate, 96.6% on average, albeit with a longer initialisation time. The two VRS approaches achieved a lower initialisation success rate of 80% over the large triangle. In terms of RTK positioning accuracy after successful initialisation, the results indicated good agreement between the actual error growth, in both the horizontal and vertical components, and the accuracy specified by the manufacturers as RMS and parts-per-million (ppm) values.

Additionally, the VRS approaches performed better than MAX and i-MAX when tested under the standard triangle network with a mean inter-station distance of 69 km. However, as the inter-station distance increases, the network RTK software may fail to generate VRS corrections and may then revert to operating in the nearest single-base RTK (or RAW) mode. Position uncertainty occasionally exceeded 2 metres, showing that the RTK rover software was using an incorrectly fixed ambiguity solution to estimate the rover position rather than automatically dropping back to an ambiguity-float solution. The results identified that the risk of incorrectly resolving ambiguities reached 18%, 20%, 13% and 25% for i-MAX, MAX, Leica VRS and Trimble VRS respectively when operating over the large triangle network. Additionally, the Coordinate Quality indicator values given by the Leica GX1230 GG rover receiver tended to be over-optimistic and did not reliably identify incorrectly fixed integer ambiguity solutions. In summary, this independent assessment has identified problems and failures that can occur in all of the systems tested, especially when they are pushed beyond the recommended limits. While such failures are expected, they offer useful insights into where users should be wary and how manufacturers might improve their products. The results also demonstrate that integrity monitoring of RTK solutions is necessary for precision applications and deserves serious attention from researchers and system providers.
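The error-growth finding above is usually read against a manufacturer accuracy specification of the form "a mm + b ppm" of baseline length. As a minimal, hypothetical sketch (the coefficients below are placeholders, not any vendor's actual figures), the following Python snippet evaluates such a specification at the three mean inter-station distances tested:

```python
# Minimal illustrative sketch: evaluating a manufacturer-style "a mm + b ppm"
# accuracy specification at the inter-station distances used in the study.
# The coefficients a_mm and b_ppm are hypothetical placeholders.

def expected_rms_mm(a_mm: float, b_ppm: float, baseline_km: float) -> float:
    # 1 ppm of a baseline equals 1 mm per km, so the distance-dependent
    # term is simply b_ppm * baseline_km (in millimetres).
    return a_mm + b_ppm * baseline_km

if __name__ == "__main__":
    a_mm, b_ppm = 10.0, 1.0  # hypothetical "10 mm + 1 ppm" specification
    for baseline_km in (69, 118, 166):  # mean inter-station distances tested
        rms = expected_rms_mm(a_mm, b_ppm, baseline_km)
        print(f"{baseline_km:>3} km baseline -> expected RMS ~ {rms:.0f} mm")
```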

Relevance: 10.00%

Publisher:

Abstract:

We investigate known security flaws in the context of security ceremonies to gain an understanding of the ceremony analysis process. The term security ceremony describes a system of protocols and humans which interact for a specific purpose. Security ceremonies and ceremony analysis form an area of research in its infancy, and we explore the basic principles involved to better understand the issues. We analyse three ceremonies, HTTPS, EMV and Opera Mini, and use the information gained from the experience to establish a list of typical flaws in ceremonies. Finally, we use that list to analyse a protocol proven secure for human use. This leads to an appreciation of the strengths and weaknesses of ceremony analysis.

Relevance: 10.00%

Publisher:

Abstract:

From their very outset, the disciplines of social science have claimed a need for interdisciplinarity. Proponents of new disciplines have also claimed the whole of human activity as their domain, whilst simultaneously emphasising the need for increased specialisation. Critical social analysis attempts to repair the flaws of specialisation. In this chapter, I argue that the trend towards academic specialisation in social science is most usefully viewed from the perspective of evaluative meaning, and that each new discipline, in emphasising one aspect of a broken conception of humanity, necessarily emphasises one aspect of an already broken conception of value. Critical discourse analysis, qua critical social analysis, may therefore benefit by proceeding first from the perspective of evaluative meaning in order to understand the dynamics of social change and overcome the challenges posed by centuries of intensive specialisation in social science.

Relevance: 10.00%

Publisher:

Abstract:

The purpose of this article is to present lessons learnt by nurses when conducting research, in order to encourage colleagues to ask good clinical research questions. This is accomplished by presenting a study, designed to challenge current practice, that itself included research flaws. The longstanding practice of weighing renal patients at 0600 hours and then again prior to receiving haemodialysis was examined. Nurses believed that performing the assessment twice, often within a few hours, was unnecessary and that patients were angry when woken to be weighed. An observational study with convenience sampling collected data from 46 individuals requiring haemodialysis, who were repeatedly sampled to provide 139 episodes of data. Although the research hypotheses were rejected, invaluable experience was gained, with research and clinical practice lessons learnt, along with some surprising findings.

Relevance: 10.00%

Publisher:

Abstract:

The automated extraction of roads from aerial imagery can be of value for tasks including mapping, surveillance and change detection. Unfortunately, there are no public databases or standard protocols for evaluating these techniques. Many techniques are further hindered by a reliance on manual initialisation, making large-scale application impractical. In this paper, we present a public database and evaluation protocol for road extraction algorithms, and propose an improved automatic seed-finding technique, based on a combination of geometric and colour features, to initialise road extraction.
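The seed-finding idea above combines geometric and colour cues. The sketch below is a hypothetical illustration of that general idea rather than the algorithm proposed in the paper: it keeps the centroids of near-grey, strongly elongated regions as candidate road seeds.

```python
# Hypothetical illustration of colour + geometry based seed finding; it is
# not the algorithm proposed in the paper, only a sketch of the idea.
import numpy as np
from scipy import ndimage

def find_road_seeds(rgb: np.ndarray, grey_tol: float = 25.0,
                    min_elongation: float = 3.0):
    """Return (row, col) centroids of elongated, road-coloured regions.

    rgb: H x W x 3 float array in [0, 255].
    grey_tol: roads are assumed to be roughly grey (small channel spread).
    min_elongation: principal-axis ratio required to accept a region.
    """
    # Colour cue: near-grey pixels (small spread across R, G, B channels).
    spread = rgb.max(axis=2) - rgb.min(axis=2)
    mask = spread < grey_tol

    # Geometric cue: keep connected regions that are long and thin.
    labels, n = ndimage.label(mask)
    seeds = []
    for region in range(1, n + 1):
        ys, xs = np.nonzero(labels == region)
        if ys.size < 50:           # ignore tiny speckle regions
            continue
        coords = np.stack([ys, xs]).astype(float)
        evals = np.sort(np.linalg.eigvalsh(np.cov(coords)))
        if evals[0] <= 0:          # degenerate (collinear) region
            continue
        # Ratio of eigenvalues is the squared ratio of the axis lengths.
        if evals[1] / evals[0] >= min_elongation ** 2:
            seeds.append((ys.mean(), xs.mean()))
    return seeds
```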

Relevance: 10.00%

Publisher:

Abstract:

Various time-memory tradeoff attacks on stream ciphers have been proposed over the years. However, the claimed success of these attacks assumes that the initialisation process of the stream cipher is one-to-one, and some stream cipher proposals do not have a one-to-one initialisation process. In this paper, we examine the impact of this on the success of time-memory-data tradeoff attacks. Under these circumstances some attacks are more successful than previously claimed, while others are less so. The conditions for both cases are established.
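As a toy illustration of why the one-to-one assumption matters (an assumption-laden sketch, not the paper's analysis): when the initialisation map is not injective, the set of initial states an attacker's tables must cover can be noticeably smaller than the nominal key-IV space, as the following comparison of a bijective and a random (non-injective) toy initialisation shows.

```python
# Toy experiment (illustrative only): count the distinct initial states
# produced by a bijective initialisation versus a random, generally
# non-injective one, over a small toy state space.
import random

STATE_BITS = 16
N = 1 << STATE_BITS                     # toy state space of 2^16 states

random.seed(1)

# A bijective "initialisation": a random permutation of the state space.
perm = list(range(N))
random.shuffle(perm)

# A non-injective "initialisation": an independent random function.
rand_fn = [random.randrange(N) for _ in range(N)]

distinct_bijective = len(set(perm))     # always N
distinct_random_fn = len(set(rand_fn))  # roughly (1 - 1/e) * N on average

print(f"nominal key/IV space        : {N}")
print(f"distinct states (bijective) : {distinct_bijective}")
print(f"distinct states (random fn) : {distinct_random_fn} "
      f"(~{distinct_random_fn / N:.2%} of the nominal space)")
```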

Relevance: 10.00%

Publisher:

Abstract:

Technologies and languages for integrated processes are a relatively recent innovation, and over this period many divergent waves of innovation have transformed process integration. Like sockets and distributed objects, early workflow systems offered programming interfaces that connected the process modelling layer to any middleware. BPM systems emerged later, connecting the modelling world to middleware through components. While BPM systems increased ease of use (modelling convenience), long-standing and complex interactions involving many process instances remained difficult to model. Enterprise Service Buses (ESBs) followed, connecting process models to heterogeneous forms of middleware. ESBs, however, generally forced modellers to choose a particular underlying middleware and stick to it, despite their ability to connect with many forms of middleware. Furthermore, ESBs encourage process integrations to be modelled on their own, logically separate from the process model, which can lead to an inability to reason about long-standing conversations at the process layer.

Technologies and languages for process integration generally lack formality, which has led to arbitrariness in the underlying language building blocks. Conceptual holes exist in a range of technologies and languages for process integration, and these can lead to customer dissatisfaction and to integration projects failing to reach their potential. Standards for process integration share fundamental flaws similar to those of the languages and technologies, and standards compete directly with one another, causing a lack of clarity. Thus, despite major advancements in the technology base, process integration remains the area of greatest risk in a BPM project.

This research examines some fundamental aspects of communication middleware and how these fundamental building blocks of integration can be brought to the process modelling layer in a technology-agnostic manner. In this way, process modelling can be conceptually complete without becoming tied to a particular middleware technology. Coloured Petri nets are used to define a formal semantics for the fundamental aspects of communication middleware; they provide the means to define and model the dynamic aspects of various integration middleware. Process integration patterns are used as a tool to codify common problems to be solved. Object Role Modelling, a formal modelling technique, was used to define the syntax of the proposed process integration language.

This thesis makes several contributions to the field of process integration. It proposes a framework defining the key notions of integration middleware, providing a conceptual foundation upon which a process integration language could be built. It defines an architecture that allows various forms of middleware to be aggregated and reasoned about at the process layer. It provides a comprehensive set of process integration patterns, which constitute a benchmark for the kinds of problems a process integration language must support. Finally, it proposes a process integration modelling language and a partial implementation that is able to enact the language. A process integration pilot project in a German hospital, based on the ideas in this thesis, is briefly described at the end of the thesis.
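As a purely illustrative sketch, not drawn from the thesis, the following minimal in-memory publish/subscribe channel shows the kind of technology-agnostic messaging building block that such a framework would expose at the process-modelling layer, independent of any concrete middleware.

```python
# Purely illustrative sketch of a technology-agnostic messaging building block:
# a minimal in-memory publish/subscribe channel. A real deployment would bind
# this abstraction to a concrete middleware (JMS, AMQP, an ESB, ...).
from collections import defaultdict
from typing import Any, Callable, DefaultDict, List

Handler = Callable[[Any], None]

class PubSubChannel:
    """One channel; every subscriber to a topic receives each published message."""

    def __init__(self) -> None:
        self._subscribers: DefaultDict[str, List[Handler]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Handler) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        for handler in self._subscribers[topic]:
            handler(message)

if __name__ == "__main__":
    channel = PubSubChannel()
    channel.subscribe("patient.admitted", lambda m: print("billing saw:", m))
    channel.subscribe("patient.admitted", lambda m: print("ward saw:   ", m))
    channel.publish("patient.admitted", {"id": 42})
```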

Relevance: 10.00%

Publisher:

Abstract:

Significant numbers of children are severely abused and neglected by parents and caregivers. Infants and very young children are the most vulnerable and are unable to seek help. To identify these situations and enable child protection and the provision of appropriate assistance, many jurisdictions have enacted ‘mandatory reporting laws’ requiring designated professionals such as doctors, nurses, police and teachers to report suspected cases of severe child abuse and neglect. Other jurisdictions have not adopted this legislative approach, at least partly motivated by a concern that the laws produce dramatic increases in unwarranted reports which, it is argued, lead to investigations that infringe on people’s privacy, cause trauma to innocent parents and families, and divert scarce government resources from deserving cases. The primary purpose of this paper is to explore the extent to which opposition to mandatory reporting laws is valid based on the claim that the laws produce ‘overreporting’.

The first part of this paper revisits the original mandatory reporting laws, discusses their development into various current forms, explains their relationship with policy and common law reporting obligations, and situates them in the context of their place in modern child protection systems. This part of the paper shows that, in general, contemporary reporting laws have expanded far beyond their original conceptualisation, but that there is also now a deeper understanding of the nature, incidence, timing and effects of different types of severe maltreatment, an awareness that the real incidence of maltreatment is far higher than that officially recorded, and strong evidence showing that the majority of identified cases of severe maltreatment are the result of reports by mandated reporters.

The second part of this paper discusses the apparent effect of mandatory reporting laws on ‘overreporting’ by referring to Australian government data about reporting patterns and outcomes, with a particular focus on New South Wales. Raw descriptive data about report numbers and outcomes appear to show that reporting laws produce both desirable consequences (identification of severe cases) and problematic consequences (increased numbers of unsubstantiated reports). Yet, to explore the extent to which the data support the overreporting claim, and because numbers of unsubstantiated reports alone cannot demonstrate overreporting, this part of the paper asks further questions of the data. Who makes reports, about which maltreatment types, and what are the outcomes of those reports? What is the nature of these reports; for example, to what extent are multiple reports made about the same child? What meaning can be attached to an ‘unsubstantiated’ report, and can such reports be used to show flaws in reporting effectiveness and problems in reporting laws?

It is suggested that the available evidence from Australia is not sufficiently detailed or strong to demonstrate the overreporting claim. However, it is also apparent that, whether adopting an approach based on public health and/or other principles, much better evidence about reporting needs to be collected and analysed. More nuanced research needs to be conducted to identify what can reasonably be said to constitute ‘overreports’, and efforts must be made to minimise unsatisfactory reporting practice, informed by the relevant jurisdiction’s context and aims. It is also concluded that, depending on the jurisdiction, the available data may provide useful indicators of positive, negative and unanticipated effects of specific components of the laws, and of the strengths, weaknesses and needs of the child protection system.

Relevance: 10.00%

Publisher:

Abstract:

Growth in productivity is the key determinant of the long-term health and prosperity of an economy. As the construction industry is one of major strategic importance, its productivity performance has a significant effect on national economic growth. The relationship between construction output and the economy has been studied intensively, but there is a lack of empirical study on the relationship between construction productivity and economic fluctuations. Fluctuations in construction output are endemic in the industry; in part they are caused by the boom and slump of the economy as a whole, and in part by the nature of the construction product. This research aims to uncover how the productivity of the construction sector is influenced in the course of economic fluctuations in Malaysia.

Malaysia has adopted three economic policies since gaining independence in 1959: the New Economic Policy (1971-1990), the National Development Policy (1991-2000) and the National Vision Policy (2001-2010). The Privatisation Master Plan was introduced in 1991. Operating within this historical context, the Malaysian construction sector has experienced four business cycles since 1960. A mixed-method design is adopted in this study: quantitative analysis was conducted on the published official statistics of the construction industry and the overall economy in Malaysia between 1970 and 2009, and the qualitative study involved interviews with a purposive sample of 21 industry participants.

This study identified a 32-year building cycle spanning 1975-2006, superimposed on three shorter construction business cycles in 1975-1987, 1987-1999 and 1999-2006. The correlations between construction labour productivity (CLP) and GDP per capita are statistically significant for the 1975-2006 building cycle and for the 1987-1999 and 1999-2006 construction business cycles, but not for the 1975-1987 cycle. The Construction Industry Surveys/Census over the period 1996 to 2007 show that the average growth rate of total output per employee expanded while value added per employee contracted, implying a high cost of bought-in materials and services and inefficient use of purchases. Construction labour productivity peaked in 2004, even though the construction sector contracted in that year. The residential sub-sector performed relatively better than the other sub-sectors on most productivity indicators. Improvements are found in output per employee, value added per employee, labour competitiveness and capital investment, but declines are recorded in value added content and capital productivity. Civil engineering construction is the most productive in terms of labour productivity but relatively poorer in capital productivity. Labour cost is more competitive in larger establishments, and value added per unit of labour cost is higher in larger establishments, attributable to more efficient utilisation of capital.

The interviews with industry participants reveal that the productivity of the construction sector is influenced by the economic environment, construction methods, contract arrangements, the payment chain and regulatory policies. Fluctuations in construction demand have caused companies to switch to a defensive strategy during economic downturns, aiming to ensure short-term survival rather than to make a profit for long-term survival and growth. This leads companies to take drastic measures to curb expenses: downsizing, contract employment, diversification and venturing into overseas markets. There is no empirical evidence supporting downsizing as a necessary step in reviving productivity, and productivity does not correlate with firm size: a relatively smaller and focused firm is more productive than a larger and diversified organisation, although diversified companies experienced less fluctuation in both labour and capital productivity. To improve the productivity of the construction sector, it is necessary to remove the negatives and flaws of past practices. The recommended measures include long-term strategic planning, coordinated approaches by government agencies in planning infrastructure development, and regulatory environments which encourage competition and facilitate productivity improvement.

Relevance: 10.00%

Publisher:

Abstract:

Despite multiple efforts, the extent of poverty in Bangladesh has remained alarmingly high by any standard. Two salient characteristics of poverty alleviation efforts in Bangladesh are their poor accessibility for the ‘target’ population (the rural poor) and the lack of co-ordination between government and non-government organisations (NGOs). Where the state alone is unable to combat poverty, NGOs come into the picture to fill the void. First Britain as a colonial power, then the East Pakistan Government and the Government of Bangladesh have promulgated Ordinances and Regulations for the practical regulation of NGOs. Loopholes and flaws within this legal framework have given NGOs opportunities to violate the Ordinances and Regulations. A better situation could be achieved by modifying and strictly implementing such state rules, ensuring accountability and effective state control, and fostering meaningful NGO-State collaboration and co-operation.

Relevance: 10.00%

Publisher:

Abstract:

Anthony Downs' public choice theory proposes that every rational person will try to meet their own desires in preference to those of others, and that such rational persons will attempt to satisfy these desires in the most efficient manner possible. This paper demonstrates that the application of this theory implies that public servants and politicians will perform acts of corruption and maladministration in order to meet their desires efficiently. As such action is unavoidable, political parties must appear to meet the public demand for accountability systems, but must not make these systems viable lest they expose the corruption and maladministration that would threaten the government's chance of re-election. It is therefore logical for governments to display a commitment to accountability whilst simultaneously ensuring the systems cannot interfere with government control or expose its flaws.

Relevance: 10.00%

Publisher:

Abstract:

U-Healthcare refers to the provision of healthcare services "at anytime and anywhere" using wired, wireless and ubiquitous sensor network technologies. As a main field of U-Healthcare, Telehealth has been developed as an enhancement of Telemedicine. This system includes two-way interactive web-video communications, sensor technology and health informatics. With these components, it assists patients in receiving an initial diagnosis. Furthermore, Telehealth helps doctors diagnose patients' diseases at early stages and recommend treatments. However, the system has limitations, such as privacy issues, interruption of real-time services and incorrect orders arising from remote diagnosis. To deal with these flaws, security procedures such as authorised access should be applied as an indispensable component of the medical environment. As a consequence, a Telehealth system incorporating these protection procedures in clinical services will be able to cope with the anticipated vulnerabilities and security issues of U-Healthcare services.
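As a purely illustrative sketch of the "authorised access" safeguard mentioned above (the roles, record fields and helper names are hypothetical, not taken from the paper), a minimal role-based check on reading a telehealth record might look like this:

```python
# Hypothetical sketch of role-based "authorised access" to telehealth records.
# Roles, record fields and function names are illustrative only.
from dataclasses import dataclass

READ_ROLES = {"doctor", "nurse"}   # roles allowed to read records in this toy policy

@dataclass
class User:
    name: str
    role: str

@dataclass
class PatientRecord:
    patient_id: str
    vitals: dict

class AccessDenied(Exception):
    pass

def read_record(user: User, record: PatientRecord) -> dict:
    """Return the record's vitals only if the user's role is authorised."""
    if user.role not in READ_ROLES:
        raise AccessDenied(f"{user.name} ({user.role}) may not read records")
    return record.vitals

if __name__ == "__main__":
    record = PatientRecord("p-001", {"hr": 72, "bp": "120/80"})
    print(read_record(User("Dr Kim", "doctor"), record))   # allowed
    try:
        read_record(User("Visitor", "guest"), record)      # denied
    except AccessDenied as e:
        print("denied:", e)
```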

Relevance: 10.00%

Publisher:

Abstract:

Background: Patients with chest pain contribute substantially to emergency department attendances, lengthy hospital stay, and inpatient admissions. A reliable, reproducible, and fast process to identify patients presenting with chest pain who have a low short-term risk of a major adverse cardiac event is needed to facilitate early discharge. We aimed to prospectively validate the safety of a predefined 2-h accelerated diagnostic protocol (ADP) to assess patients presenting to the emergency department with chest pain symptoms suggestive of acute coronary syndrome.

Methods: This observational study was undertaken in 14 emergency departments in nine countries in the Asia-Pacific region, in patients aged 18 years and older with at least 5 min of chest pain. The ADP included use of a structured pre-test probability scoring method (Thrombolysis in Myocardial Infarction [TIMI] score), electrocardiograph, and point-of-care biomarker panel of troponin, creatine kinase MB, and myoglobin. The primary endpoint was major adverse cardiac events within 30 days after initial presentation (including initial hospital attendance). This trial is registered with the Australia-New Zealand Clinical Trials Registry, number ACTRN12609000283279.

Findings: 3582 consecutive patients were recruited and completed 30-day follow-up. 421 (11.8%) patients had a major adverse cardiac event. The ADP classified 352 (9.8%) patients as low risk and potentially suitable for early discharge. A major adverse cardiac event occurred in three (0.9%) of these patients, giving the ADP a sensitivity of 99.3% (95% CI 97.9–99.8), a negative predictive value of 99.1% (97.3–99.8), and a specificity of 11.0% (10.0–12.2).

Interpretation: This novel ADP identifies patients at very low risk of a short-term major adverse cardiac event who might be suitable for early discharge. Such an approach could be used to decrease the overall observation periods and admissions for chest pain. The components needed for the implementation of this strategy are widely available. The ADP has the potential to affect health-service delivery worldwide.
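The reported sensitivity, negative predictive value and specificity follow directly from the counts given in the abstract; the short calculation below reproduces them (only stated numbers are used, with the non-event counts derived from the totals):

```python
# Reproduce the diagnostic accuracy figures from the counts in the abstract.
total = 3582          # patients recruited with complete 30-day follow-up
mace = 421            # patients with a major adverse cardiac event (MACE)
low_risk = 352        # patients classified low risk by the 2-h ADP
mace_in_low_risk = 3  # MACE among those classified low risk

no_mace = total - mace                    # 3161 patients without MACE
true_neg = low_risk - mace_in_low_risk    # 349 low-risk patients without MACE

sensitivity = (mace - mace_in_low_risk) / mace   # MACE cases not ruled out by the ADP
npv = true_neg / low_risk                        # low-risk calls that were correct
specificity = true_neg / no_mace                 # non-MACE correctly classified low risk

print(f"sensitivity : {sensitivity:.1%}")   # ~99.3%
print(f"NPV         : {npv:.1%}")           # ~99.1%
print(f"specificity : {specificity:.1%}")   # ~11.0%
```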

Relevance: 10.00%

Publisher:

Abstract:

This paper presents an analysis of the stream cipher Mixer, a bit-based cipher with structural components similar to the well-known Grain cipher and the LILI family of keystream generators. Mixer uses a 128-bit key and a 64-bit IV to initialise a 217-bit internal state. The analysis focuses on the initialisation function of Mixer and shows that there exist multiple key-IV pairs which, after initialisation, produce the same initial state and consequently generate the same keystream. Furthermore, if the number of iterations of the state-update function performed during initialisation is increased, then the number of distinct initial states that can be obtained decreases. It is also shown that there exist distinct initial states which produce the same keystream, resulting in a further reduction of the effective key space.
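As a toy illustration of the state-collapse effect described above (a sketch over a small artificial state space, not the Mixer analysis itself), iterating a random non-injective update function shows how each additional initialisation round can further shrink the set of distinct reachable initial states:

```python
# Toy illustration (not the Mixer cipher): iterate a random, non-injective
# state-update function and count how many distinct states remain reachable
# after each additional initialisation round.
import random

STATE_BITS = 14
N = 1 << STATE_BITS

random.seed(7)
update = [random.randrange(N) for _ in range(N)]  # fixed random (non-injective) update

states = set(range(N))          # all states loadable from key/IV before initialisation
for rounds in range(1, 6):
    states = {update[s] for s in states}
    print(f"after {rounds} round(s): {len(states)} distinct states "
          f"({len(states) / N:.1%} of the state space)")
```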

Relevance: 10.00%

Publisher:

Abstract:

This paper examines the instances of, and motivations for, noble cause corruption perpetrated by NSW police officers. Noble cause corruption occurs when a person tries to produce a just outcome through unjust methods, for example, police manipulating evidence to ensure the conviction of a known offender. Normal integrity regime initiatives are unlikely to halt noble cause corruption, as its basis lies in an attempt to do good by compensating for the apparent flaws of an unjust system. This paper suggests that the solution lies in a change of culture through improved leadership, and uses the political theories of Roger Myerson to propose a possible solution. Evidence from police officers in transcripts of the Wood Inquiry (1997) is examined to discern their participation in noble cause corruption and their rationalisation of this behaviour. The overall finding is that officers were motivated to indulge in this type of corruption by a desire to produce convictions where they felt the system unfairly worked against their ability to do their job correctly. We add to the literature by demonstrating that the rewards can be positive: police seek job satisfaction through the ability to convict the guilty, and would be better able to do so through improved equipment and investigative powers.