184 results for specification


Relevance: 10.00%

Abstract:

A significant issue encountered when fusing data received from multiple sensors is the accuracy of the timestamp associated with each piece of data. This is particularly important in applications such as Simultaneous Localisation and Mapping (SLAM), where vehicle velocity forms an important part of the mapping algorithms; on fast-moving vehicles, even millisecond inconsistencies in data timestamping can produce errors which need to be compensated for. The timestamping problem is compounded in a robot swarm environment due to the use of non-deterministic readily-available hardware (such as 802.11-based wireless) and inaccurate clock synchronisation protocols (such as Network Time Protocol (NTP)). As a result, the synchronisation of the clocks between robots can be out by tens to hundreds of milliseconds, making correlation of data difficult and preventing the possibility of the units performing synchronised actions such as triggering cameras or intricate swarm manoeuvres.

In this thesis, a complete data fusion unit is designed, implemented and tested. The unit, named BabelFuse, is able to accept sensor data from a number of low-speed communication buses (such as RS232, RS485 and CAN Bus) and also timestamp events that occur on General Purpose Input/Output (GPIO) pins, referencing a submillisecond-accurate wirelessly-distributed "global" clock signal. In addition to its timestamping capabilities, it can also be used to trigger an attached camera at a predefined start time and frame rate. This functionality enables the creation of a wirelessly-synchronised distributed image acquisition system over a large geographic area; a real-world application for this functionality is the creation of a platform to facilitate wirelessly-distributed 3D stereoscopic vision.

A ‘best-practice’ design methodology is adopted within the project to ensure the final system operates according to its requirements. Initially, requirements are generated, from which a high-level architecture is distilled. This architecture is then converted into a hardware specification and low-level design, which is then manufactured. The manufactured hardware is then verified to ensure it operates as designed, and firmware and Linux Operating System (OS) drivers are written to provide the features and connectivity required of the system. Finally, integration testing is performed to ensure the unit functions as per its requirements.

The BabelFuse System comprises a single Grand Master unit, which is responsible for maintaining the absolute value of the "global" clock. Slave nodes then determine their local clock offset from that of the Grand Master via synchronisation events which occur multiple times per second. The mechanism used for synchronising the clocks between the boards wirelessly makes use of specific hardware and a firmware protocol based on elements of the IEEE 1588 Precision Time Protocol (PTP). With the key requirement of the system being submillisecond-accurate clock synchronisation (as a basis for timestamping and camera triggering), automated testing is carried out to monitor the offsets between each Slave and the Grand Master over time. A common strobe pulse is also sent to each unit for timestamping; the correlation between the timestamps of the different units is used to validate the clock offset results.
Analysis of the automated test results shows that the BabelFuse units are almost three orders of magnitude more accurate than their requirement; the clocks of the Slave and Grand Master units do not differ by more than three microseconds over a running time of six hours, and the mean clock offset of Slaves to the Grand Master is less than one microsecond. The common strobe pulse used to verify the clock offset data yields a positive result, with a maximum variation between units of less than two microseconds and a mean value of less than one microsecond. The camera triggering functionality is verified by connecting the trigger pulse output of each board to a four-channel digital oscilloscope and setting each unit to output a 100 Hz periodic pulse with a common start time. The resulting waveform shows a maximum variation between the rising edges of the pulses of approximately 39 µs, well below its target of 1 ms.
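
As a rough illustration of the clock arithmetic involved, the sketch below shows the textbook IEEE 1588 delay-request/response calculation of a slave's offset from the master; it is a minimal illustration of the standard PTP formulas under a symmetric-path assumption, not the BabelFuse firmware itself, and the timestamps are invented.

    def ptp_offset_and_delay(t1, t2, t3, t4):
        """Standard PTP two-way exchange, assuming a symmetric network path.
        t1: master sends Sync, t2: slave receives Sync,
        t3: slave sends Delay_Req, t4: master receives Delay_Req."""
        offset = ((t2 - t1) - (t4 - t3)) / 2.0   # slave clock minus master clock
        delay = ((t2 - t1) + (t4 - t3)) / 2.0    # one-way path delay
        return offset, delay

    # Hypothetical timestamps (microseconds) from one synchronisation event:
    print(ptp_offset_and_delay(1000.0, 1250.0, 1400.0, 1500.0))  # -> (75.0, 175.0)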

Relevance: 10.00%

Abstract:

The suitability of Role Based Access Control (RBAC) is being challenged in dynamic environments like healthcare. In an RBAC system, a user's legitimate access may be denied if their need has not been anticipated by the security administrator at the time of policy specification. Alternatively, even when the policy is correctly specified, an authorised user may accidentally or intentionally misuse the granted permission. The heart of the challenge is the intrinsic unpredictability of users' operational needs as well as their incentives to misuse permissions. In this paper we propose a novel Budget-aware Role Based Access Control (B-RBAC) model that extends RBAC with the explicit notion of budget and cost, where users are assigned a limited budget through which they pay for the cost of the permissions they need. We propose a model in which the value of resources is explicitly defined and an RBAC policy is used as a reference point to discriminate the price of access permissions, as opposed to representing hard and fast rules for making access decisions. This approach has several desirable properties. It enables users to acquire unassigned permissions if they deem them necessary. However, users' misuse capability is always bounded by their allocated budget and is further adjustable through the discrimination of permission prices. Finally, it provides a uniform mechanism for the detection and prevention of misuse.
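
A minimal sketch of the budget-and-price idea, assuming invented roles, permissions, prices and a markup factor (the paper defines these elements formally; nothing below is taken from it): permissions assigned to the user's role are cheap, unassigned ones carry a discriminated higher price, and every grant is bounded by the remaining budget.

    from dataclasses import dataclass

    @dataclass
    class User:
        role: str
        budget: float
        spent: float = 0.0

    ROLE_PERMISSIONS = {"nurse": {"read_record"}, "doctor": {"read_record", "prescribe"}}
    BASE_PRICE = {"read_record": 1.0, "prescribe": 5.0}
    UNASSIGNED_MARKUP = 4.0   # price discrimination against off-role permissions

    def request(user: User, permission: str) -> bool:
        price = BASE_PRICE[permission]
        if permission not in ROLE_PERMISSIONS.get(user.role, set()):
            price *= UNASSIGNED_MARKUP          # unassigned, so more expensive
        if user.spent + price > user.budget:
            return False                        # misuse capability bounded by the budget
        user.spent += price
        return True

    u = User(role="nurse", budget=10.0)
    print(request(u, "read_record"), request(u, "prescribe"), request(u, "prescribe"))

Here the off-role prescribe request is refused because its marked-up price exceeds the remaining budget, which is the bounded-misuse property described above.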

Relevance: 10.00%

Abstract:

The Implementation Guide for the Hospital Surveillance of SAB has been produced by the Healthcare Associated Infection (HAI) Technical Working Group of the Australian Commission on Safety and Quality in Health Care (ACSQHC), and endorsed by the HAI Advisory Group. The Technical Working Group is made up of representatives invited from surveillance units and the ACSQHC, who have had input into the preparation of this Guide. The Guide has been developed to ensure consistency in the reporting of SAB across public and private hospitals, to enable accurate national reporting and benchmarking. It is intended to be used by Australian hospitals and organisations to support the implementation of healthcare-associated Staphylococcus aureus bacteraemia (SAB) surveillance using the endorsed case definition and the further detail in the Data Set Specification.

Relevance: 10.00%

Abstract:

There has been a recent surge of interest in cooking skills in a diverse range of fields, such as health, education and public policy. There appears to be an assumption that cooking skills are in decline and that this is having an adverse impact on individual health and well-being, and on family wholesomeness. The problematisation of cooking skills is not new, and can be seen in a number of historical developments that have specified particular pedagogies about food and eating. The purpose of this paper is to examine pedagogies on cooking skills and the importance accorded to them. The paper draws on Foucault’s work on governmentality. By using examples from the USA, UK and Australia, the paper demonstrates the ways that authoritative discourses on the know-how and the know-what about food and cooking – called here ‘savoir fare’ – are developed and promulgated. These discourses, and the moral panics in which they are embedded, require individuals to make choices about what to cook and how to cook, and in doing so establish moral pedagogies concerning good and bad cooking. The development of food literacy programmes, which see cooking skills as life skills, further extends the obligations to ‘cook properly’ to wider populations. The emphasis on cooking knowledge and skills has ushered in new forms of government: firstly, through a relationship between expertise and politics, which is readily visible in the authority that underpins the need to develop skills in food provisioning and preparation; secondly, through a new pluralisation of ‘social’ technologies, which invites a range of private-public interest through, for example, television cooking programmes featuring cooking skills, albeit set in a particular milieu of entertainment; and lastly, through a new specification of the subject, which can be seen in the formation of a choosing subject, one which has to problematise food choice in relation to expert advice and guidance. A governmentality focus shows that as discourses develop about what is the correct level of ‘savoir fare’, new discursive subject positions are opened up. Armed with the understanding of what is considered expert-endorsed acceptable food knowledge, subjects judge themselves through self-surveillance. The result is a powerful food and family morality that is both disciplined and disciplinary.

Relevance: 10.00%

Abstract:

Educators are faced with many challenging questions in designing an effective curriculum. What prerequisite knowledge do students have before commencing a new subject? At what level of mastery? What is the spread of capabilities between bare-passing students and the top-performing group? How does the intended learning specification compare to student performance at the end of a subject? In this paper we present a conceptual model that helps in answering some of these questions. It has the following main capabilities: capturing the learning specification in terms of syllabus topics and outcomes; capturing mastery levels to model progression; capturing the minimal vs. aspirational learning design; capturing confidence and reliability metrics for each of these mappings; and finally, comparing and reflecting on the learning specification against actual student performance. We present a web-based implementation of the model and validate it by mapping the final exams from four programming subjects against the ACM/IEEE CS2013 topics and outcomes, using Bloom's Taxonomy as the mastery scale. We then import the itemised exam grades from 632 students across the four subjects and compare the demonstrated student performance against the expected learning for each of these. Key contributions of this work are the validated conceptual model for capturing and comparing expected learning vs. demonstrated performance, and a web-based implementation of this model, which is made freely available online as a community resource.
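
As a loose illustration of the kind of comparison the model supports, the sketch below maps hypothetical exam items to CS2013-style topics and computes per-topic attainment from itemised grades; the topic names, marks and grades are invented, and the paper's actual model and web tool are considerably richer (including Bloom-level mastery and confidence metrics).

    from collections import defaultdict

    # Hypothetical mapping of exam items to topics: item -> (topic, max marks).
    # A fuller model would also record the expected Bloom level per item.
    ITEMS = {"Q1": ("Fundamental Programming Concepts", 10),
             "Q2": ("Algorithms and Design", 15),
             "Q3": ("Fundamental Data Structures", 15)}

    # Itemised grades for a handful of hypothetical students.
    GRADES = {"s1": {"Q1": 9, "Q2": 7, "Q3": 12},
              "s2": {"Q1": 6, "Q2": 13, "Q3": 10}}

    def topic_attainment(items, grades):
        """Mean fraction of available marks earned per topic across all students."""
        earned, available = defaultdict(float), defaultdict(float)
        for student_grades in grades.values():
            for item, marks in student_grades.items():
                topic, max_marks = items[item]
                earned[topic] += marks
                available[topic] += max_marks
        return {topic: round(earned[topic] / available[topic], 2) for topic in available}

    print(topic_attainment(ITEMS, GRADES))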

Relevance: 10.00%

Abstract:

The well-known difficulties students exhibit when learning to program are often characterised as either difficulties in understanding the problem to be solved or difficulties in devising and coding a computational solution. It would therefore be helpful to understand which of these gives students the greatest trouble. Unit testing is a mainstay of large-scale software development and maintenance. A unit test suite serves not only for acceptance testing, but is also a form of requirements specification, as exemplified by agile programming methodologies in which the tests are developed before the corresponding program code. In order to better understand students’ conceptual difficulties with programming, we conducted a series of experiments in which students were required to write both unit tests and program code for non-trivial problems. Their code and tests were then assessed separately for correctness and ‘coverage’, respectively. The results allowed us to directly compare students’ abilities to characterise a computational problem, as a unit test suite, and develop a corresponding solution, as executable code. Since understanding a problem is a prerequisite to solving it, we expected students’ unit testing skills to be a strong predictor of their ability to successfully implement the corresponding program. Instead, however, we found that students’ testing abilities lag well behind their coding skills.
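
To make the "tests as specification" idea concrete, here is a minimal, invented example in the test-first spirit described above: the unit tests characterise a small problem (a hypothetical normalise_whitespace function) and the implementation is then judged against them. It is purely illustrative and not drawn from the study's exercises.

    import unittest

    def normalise_whitespace(s):
        # Implementation written after (and judged against) the tests below.
        return " ".join(s.split())

    class NormaliseWhitespaceSpec(unittest.TestCase):
        """The test suite acts as the requirements specification."""

        def test_collapses_internal_runs(self):
            self.assertEqual(normalise_whitespace("a  b\t c"), "a b c")

        def test_strips_leading_and_trailing(self):
            self.assertEqual(normalise_whitespace("  hello  "), "hello")

        def test_empty_string(self):
            self.assertEqual(normalise_whitespace(""), "")

    if __name__ == "__main__":
        unittest.main()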

Relevance: 10.00%

Abstract:

This note examines the productive efficiency of 62 starting guards during the 2011/12 National Basketball Association (NBA) season. This period coincides with the phenomenal and largely unanticipated performance of New York Knicks’ starting point guard Jeremy Lin and the attendant public and media hype known as Linsanity. We employ a data envelopment analysis (DEA) approach that includes allowance for an undesirable output, here turnovers per game, alongside the desirable outputs of points, rebounds, assists, steals and blocks per game and an input of minutes per game. The results indicate that, depending upon the specification, between 29% and 42% of NBA guards are fully efficient, including Jeremy Lin, with mean inefficiencies of 3.7% and 19.2%, respectively. However, while Jeremy Lin is technically efficient, he seldom serves as a benchmark for inefficient players, at least when compared with established players such as Chris Paul and Dwyane Wade. This suggests the uniqueness of Jeremy Lin's productive solution and may explain why his unique style of play, encompassing individual brilliance, unselfish play and team leadership, is of such broad public appeal.
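
For readers unfamiliar with DEA, the sketch below solves a basic input-oriented CCR efficiency model with a linear-programming solver. The data are made up, and turnovers are folded in as an input as a crude stand-in for the paper's more careful treatment of the undesirable output; it illustrates the general technique, not the note's exact specification.

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, j0):
        """Input-oriented CCR efficiency of DMU j0.
        X: inputs (m x n DMUs), Y: outputs (s x n DMUs)."""
        m, n = X.shape
        s, _ = Y.shape
        # Decision variables z = [theta, lambda_1 ... lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Inputs: sum_j lambda_j * x_ij - theta * x_i,j0 <= 0
        A_in = np.hstack([-X[:, [j0]], X])
        b_in = np.zeros(m)
        # Outputs: -sum_j lambda_j * y_rj <= -y_r,j0  (i.e. at least DMU j0's output)
        A_out = np.hstack([np.zeros((s, 1)), -Y])
        b_out = -Y[:, j0]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.x[0]

    # Toy data: inputs = minutes, turnovers; outputs = points, assists (per game).
    X = np.array([[36.0, 34.0, 30.0], [3.5, 2.1, 2.8]])
    Y = np.array([[22.0, 18.0, 14.0], [6.0, 9.5, 5.0]])
    print([round(ccr_efficiency(X, Y, j), 3) for j in range(3)])

A player with an efficiency of 1.0 is on the frontier; which efficient peers an inefficient player is benchmarked against is read from the optimal lambda weights (res.x[1:]), which this sketch does not report.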

Relevance: 10.00%

Abstract:

Immigration has played an important role in the historical development of Australia. Thus, it is no surprise that a large body of empirical work has developed which focuses upon how migrants fare in the land of opportunity. Much of the literature is comparatively recent, i.e. from the last ten years or so, encouraged by the advent of publicly available Australian cross-section micro data. Several different aspects of migrant welfare have been addressed, with major emphasis being placed upon earnings and unemployment experience. For recent examples see Haig (1980), Stromback (1984), Chiswick and Miller (1985), Tran-Nam and Nevile (1988) and Beggs and Chapman (1988).

The present paper contributes to the literature by providing additional empirical evidence on the native/migrant earnings differential. The data utilised are from the rather neglected Australian Bureau of Statistics (ABS) Special Supplementary Survey No. 4, 1982, otherwise known as the Family Survey. The paper also examines the importance of distinguishing between the wage and salary sector and the self-employment sector when discussing native/migrant differentials. Separate earnings equations for the two labour market groups are estimated and the native/migrant earnings differential is broken down by employment status. This is a novel application in the Australian context and provides some insight into the earnings of the self-employed, a group that, despite its size (around 20 per cent of the labour force), is frequently ignored by economic research.

Most previous empirical research fails to examine the effect of employment status on earnings. Stromback (1984) includes a dummy variable representing self-employment status in an earnings equation estimated over a pooled sample of paid and self-employed workers. The variable is found to be highly significant, which leads Stromback to question the efficacy of including the self-employed in the estimation sample. The suggestion is that, because part of self-employed earnings represents a return to non-human capital investment (i.e. investments in machinery, buildings, etc.), the structural determinants of earnings differ significantly from those for paid employees. Tran-Nam and Nevile (1988) deal with differences between paid employees and the self-employed by deleting the latter from their sample. However, deleting the self-employed from the estimation sample may lead to bias in the OLS estimation method (see Heckman 1979). The desirable properties of OLS are dependent upon estimation on a random sample. Thus, the Tran-Nam and Nevile results are likely to suffer from bias unless individuals are randomly allocated between self-employment and paid employment. The current analysis extends Tran-Nam and Nevile (1988) by explicitly treating the choice of paid employment versus self-employment as being endogenously determined. This allows an explicit test for the appropriateness of deleting self-employed workers from the sample. Earnings equations that are corrected for sample selection are estimated for both natives and migrants in the paid-employee sector. The Heckman (1979) two-step estimator is employed.

The paper is divided into five major sections. The next section presents the econometric model, incorporating the specification of the earnings generating process together with an explicit model determining an individual's employment status. In Section III the data are described. Section IV draws together the main econometric results of the paper. First, the probit estimates of the labour market status equation are documented. This is followed by presentation and discussion of the Heckman two-stage estimates of the earnings specification for both native and migrant Australians. Separate earnings equations are estimated for paid employees and the self-employed. Section V documents estimates of the native/migrant earnings differential for both categories of employees. To aid comparison with earlier work, the Oaxaca decomposition of the earnings differential for paid employees is carried out for both the simple OLS regression results and the parameter estimates corrected for sample selection effects. These differentials are interpreted and compared with previous Australian findings. A short section concludes the paper.
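
A minimal sketch of the Heckman (1979) two-step procedure described above, using statsmodels on synthetic data; the variable names (lnwage, paid, educ, exper, migrant, married) are placeholders rather than the Family Survey's actual fields, and this is the generic estimator, not the paper's full specification.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from scipy.stats import norm

    def heckman_two_step(df, y_col, select_col, x_outcome, x_select):
        """Step 1: probit for selection into paid employment; Step 2: OLS earnings
        equation on the selected subsample, augmented with the inverse Mills ratio."""
        Z = sm.add_constant(df[x_select])
        probit = sm.Probit(df[select_col], Z).fit(disp=0)
        index = np.asarray(Z) @ probit.params.values          # linear index z'gamma
        mills = norm.pdf(index) / norm.cdf(index)             # inverse Mills ratio
        selected = (df[select_col] == 1).values
        X = sm.add_constant(df.loc[selected, x_outcome]).assign(mills=mills[selected])
        return sm.OLS(df.loc[selected, y_col], X).fit()

    # Synthetic data standing in for the survey (purely illustrative).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({"educ": rng.normal(12, 2, n), "exper": rng.normal(15, 5, n),
                       "migrant": rng.integers(0, 2, n), "married": rng.integers(0, 2, n)})
    df["paid"] = (0.3 * df["educ"] - 0.1 * df["exper"] + df["married"]
                  + rng.normal(0, 1, n) > 2.5).astype(int)
    df["lnwage"] = (1.0 + 0.08 * df["educ"] + 0.02 * df["exper"]
                    - 0.05 * df["migrant"] + rng.normal(0, 0.3, n))

    print(heckman_two_step(df, "lnwage", "paid",
                           ["educ", "exper", "migrant"],
                           ["educ", "exper", "migrant", "married"]).params)

A statistically significant coefficient on the mills term is the usual informal indication that simply deleting the self-employed would bias the OLS earnings estimates.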

Relevance: 10.00%

Abstract:

The success of contemporary organizations depends on their ability to make appropriate decisions. Making appropriate decisions is inevitably bound to the availability and provision of relevant information. Information systems should be able to provide information in an efficient way. Thus, within information systems development, a detailed analysis of information supply and information demands has to prevail. Based on Syperski’s information set and subset model, we will give an epistemological foundation of information modeling in general and show why conceptual modeling in particular is capable of specifying effective and efficient information systems. Furthermore, we derive conceptual modeling requirements based on our findings. A short example illustrates the usefulness of a conceptual data modeling technique for the specification of information systems.

Relevance: 10.00%

Abstract:

Supply chain management and customer relationship management are concepts for optimizing the provision of goods to customers. Information sharing and information estimation are key tools used to implement these two concepts. The reduction of delivery times and stock levels can be seen as the main managerial objectives of an integrative supply chain and customer relationship management. To achieve this objective, business processes need to be integrated along the entire supply chain, including the end consumer. Information systems form the backbone of any business process integration. The relevant information system architectures are generally well understood, but the conceptual specification of information systems for business process integration from a management perspective remains an open methodological problem. To address this problem, we will show how customer relationship management and supply chain management information can be integrated at the conceptual level in order to provide supply chain managers with relevant information. We will further outline how the conceptual management perspective of business process integration can be supported by deriving specifications for the enabling information systems from business objectives.

Relevance: 10.00%

Abstract:

Data warehouse projects, today, are in an ambivalent situation. On the one hand, data warehouses are critical for a company’s success, and sophisticated methodological and technological tools have been developed to implement them. On the other hand, a significant number of data warehouse projects fail due to non-technical reasons such as insufficient management support or uncooperative employees. But management support and user participation can be increased dramatically with specification methods that are understandable to these user groups. This paper aims at overcoming possible non-technical failure reasons by introducing a user-adequate specification approach within the field of management information systems.

Relevance: 10.00%

Abstract:

Teachers of construction economics and estimating have for a long time recognised that there is more to construction pricing than detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done; but if it is really important, then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are in cost modelling and cost planning - neither of which pays any significant attention to the market effect.

There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known. This may be called the pragmatic school. The second school exists solely to criticise the first school. We will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose 1974 paper is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted.

Since 1974, evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practicing quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors for relatively very accurate estimates. Only the type and size of building, it seemed, was really relevant in determining accuracy. More detailed information about the buildings' specification, and even a sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect. The problem with this is that our experts do not seem able to verbalise their requirements in this respect - a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit. Hence the need for this study.

It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing. They were also very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood, and that the factors and their interrelationships can now be examined more formally in relation to construction price levels.

I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants, who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.

Relevance: 10.00%

Abstract:

Traditionally, the fire resistance rating of LSF wall systems is based on approximate prescriptive methods developed using limited fire tests. Therefore a detailed research study into the performance of load-bearing LSF wall systems under standard fire conditions was undertaken to develop improved fire design rules. It used the extensive fire performance results of eight different LSF wall systems from a series of full-scale fire tests and numerical studies for this purpose. The use of previous fire design rules developed for LSF walls subjected to non-uniform elevated temperature distributions, based on the AISI design manual and Eurocode 3 Parts 1.2 and 1.3, was investigated first. New simplified fire design rules based on AS/NZS 4600, the North American Specification and Eurocode 3 Part 1.3 were then proposed in this study, with suitable allowances for the interaction effects of compression and bending actions. The importance of considering thermal bowing, magnified thermal bowing and neutral axis shift in the fire design was also investigated. A spreadsheet-based design tool was developed based on the new design rules to predict the failure load ratio versus time and temperature curves for varying LSF wall configurations. The accuracy of the proposed design rules was verified using the test and FEA results for different wall configurations, steel grades, thicknesses and load ratios. This paper presents the details and results of this study, including the improved fire design rules for predicting the load capacity of LSF wall studs and the failure times of LSF walls under standard fire conditions.

Relevance: 10.00%

Abstract:

This paper has presented the details of an investigation into the flexural and flexural-torsional buckling behaviour of cold-formed structural steel columns with pinned and fixed ends. Current design rules for the member capacities of cold-formed steel columns are based on the same non-dimensional strength curve for both fixed- and pinned-ended columns. This research has reviewed the accuracy of the current design rules in AS/NZS 4600 and the North American Specification in determining the member capacities of cold-formed steel columns, using the results from detailed finite element analyses and an experimental study of lipped channel columns. It was found that the current Australian and American design rules accurately predicted the member capacities of pin-ended lipped channel columns undergoing flexural and flexural-torsional buckling. However, for fixed-ended columns with warping fixity undergoing flexural-torsional buckling, it was found that the current design rules significantly underestimated the column capacities as they disregard the beneficial effect of warping fixity. This paper has therefore proposed improved design rules and verified their accuracy using finite element analysis and test results of cold-formed lipped channel columns made of three cross-sections and five different steel grades and thicknesses.
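
For context, a minimal sketch of the non-dimensional column strength curve shared by AS/NZS 4600 and the North American Specification, as it is commonly stated, is given below; the input values are illustrative, and the paper's proposed improvements for warping-fixed columns are not reproduced here.

    import math

    def nominal_column_stress(fy, fcre):
        """fy: yield stress, fcre: least elastic buckling stress (flexural or
        flexural-torsional), both in MPa. Returns the nominal member stress Fn."""
        lam = math.sqrt(fy / fcre)                 # non-dimensional slenderness
        if lam <= 1.5:
            return (0.658 ** (lam ** 2)) * fy      # inelastic range
        return (0.877 / lam ** 2) * fy             # elastic range

    # Illustrative numbers only: a 450 MPa steel with a 300 MPa elastic buckling stress.
    print(round(nominal_column_stress(fy=450.0, fcre=300.0), 1))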

Relevance: 10.00%

Abstract:

Web-based social networking applications have become increasingly important in recent years. Current applications in the healthcare sphere can support health management, but to date there is no patient-controlled integrator. This paper proposes a platform called the Multiple Profile Manager (MPM) that enables a user to create and manage an integrated profile that can be shared across numerous social network sites. Moreover, it is able to facilitate the collection of personal healthcare data, which contributes to the development of public health informatics. Here we illustrate how patients and physicians can benefit from enabling the platform for online social network sites. The MPM simplifies the management of patients' profiles and allows health professionals to obtain a more complete picture of the patients' background so that they can provide better health care. To do so, we demonstrate a prototype of the platform and describe its protocol specification, which is an XMPP (Extensible Messaging and Presence Protocol) [1] extension for sharing and synchronising profile data (vCard) between different social networks.
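
As a rough illustration of profile data travelling over XMPP, the snippet below wraps a couple of vCard 4 fields in an iq stanza using only the Python standard library; the element layout follows the general shape of vCard-over-XMPP extensions and is not the MPM protocol specification itself, and the JID and field values are invented.

    import xml.etree.ElementTree as ET

    def build_profile_iq(jid, full_name, email):
        # An iq "set" carrying a vCard 4 payload to the given JID (illustrative only).
        iq = ET.Element("iq", {"type": "set", "to": jid, "id": "profile-sync-1"})
        vcard = ET.SubElement(iq, "vcard", {"xmlns": "urn:ietf:params:xml:ns:vcard-4.0"})
        fn = ET.SubElement(vcard, "fn")
        ET.SubElement(fn, "text").text = full_name
        em = ET.SubElement(vcard, "email")
        ET.SubElement(em, "text").text = email
        return ET.tostring(iq, encoding="unicode")

    print(build_profile_iq("alice@example.org", "Alice Example", "alice@example.org"))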