253 results for Tibetan coded character set extension B


Relevance: 20.00%

Publisher:

Abstract:

The recent Supreme Court decision of Queensland v B [2008] 2 Qd R 562 has significant implications for the law that governs consent and abortions. The judgment purports to extend the ratio of Secretary, Department of Health and Community Services (NT) v JWB and SMB (1991) 175 CLR 218 (Marion’s Case) and impose a requirement of court approval for terminations of pregnancy for minors who are not Gillick-competent. This article argues against the imposition of this requirement on the ground that such an approach is an unjustifiable extension of the reasoning in Marion’s Case. The decision, which is the first judicial consideration in Queensland of the position of medical terminations, also reveals systemic problems with the criminal law in that State. In concluding that the traditional legal excuse for abortions will not apply to those which are performed medically, Queensland v B provides further support for calls to reform this area of law.

Relevance: 20.00%

Publisher:

Abstract:

Building Information Modelling (BIM) is an information technology (IT) enabled approach to managing design data in the AEC/FM (Architecture, Engineering and Construction / Facilities Management) industry. BIM enables improved interdisciplinary collaboration across distributed teams, intelligent documentation and information retrieval, greater consistency in building data, better conflict detection and enhanced facilities management. Despite these apparent benefits, the adoption of BIM in practice has been slow. Workshops with industry focus groups were conducted to identify the needs, concerns and expectations of participants who had implemented BIM or were BIM "ready". Factors inhibiting BIM adoption include lack of training, low business incentives, perception of a lack of rewards, technological concerns, industry fragmentation related to uneven ICT adoption practices, contractual matters and resistance to changing current work practices. Successful BIM usage depends on collective adoption of BIM across the different disciplines and support by the client. Relating current work practices to future BIM scenarios was identified as an important strategy, as the participants believed that BIM cannot be used efficiently with traditional practices and methods. The key to successful implementation is to explore the extent to which current work practices must change. Currently there is a perception that all work practices and processes must adapt and change for effective usage of BIM. It is acknowledged that new roles and responsibilities are emerging and that different parties will lead BIM on different projects. A contingency-based approach to the problem of implementation was taken, which relies upon the integration of a BIM project champion, procurement strategy, team capability analysis, commercial software availability/applicability, and phase decision making and event analysis.
Organizations need to understand: (a) their own work processes and requirements; (b) the range of BIM applications available in the market and their capabilities; (c) the potential benefits of different BIM applications and their roles in different phases of the project lifecycle; and (d) collective supply chain adoption capabilities. A framework is proposed to support organizations' selection of BIM usage strategies that meet their project requirements. Case studies are being conducted to develop the framework. The results of a preliminary design management case study are presented for contractor-led BIM specific to the design and construct procurement strategy.

Relevance: 20.00%

Publisher:

Abstract:

For a number of years now it has been evident that the major issue facing science educators in the more developed countries of the world is the quantitative decline in enrolments in the senior secondary sciences, particularly the physical sciences, and in the number of higher achieving students applying for places in universities to undertake further studies in science. The deep malaise in school science to which these quantitative measures point has been elucidated by more qualitative studies of students' experience of studying science in secondary school in several of these countries (Sweden, Lindahl (2003); England, Simon and Osborne (2002); and Australia, Lyons (2005)). Remarkably concordant descriptions of these experiences can be summarized as follows. School science is:
• the transmission of knowledge from the teacher or the textbook to the students;
• about content that is irrelevant and boring to students' lives;
• difficult to learn in comparison with other subjects.
Incidentally, the Australian study only involved consistently high achieving students; but even so, most of them found science more difficult than other, more interesting subjects, and concluded that further science studies should be avoided unless they were needed for some career purpose. Other, more representative confirmations of negative evaluations of the science curricula across Australia (and in particular states) are now available from the large scale reviews of Goodrum, Hackling and Rennie (2001) and from TIMSS (2002). The former reported that well under half of secondary students find the science at school "relevant to my future", "useful in everyday life", "deals with things I am concerned with" or "helps me make decisions about my health". TIMSS found that 62% of females and 65% of males in Year 4 agree with "I like learning science", but by Year 8 only 26% and 33% respectively still agree.
Students in Japan have been doubly notable because of (a) their high performance in international measures of science achievement like TIMSS and PISA and (b) their very low response to items in these studies which relate to interest in science. Ogura (2003) reported an intra-national study of students' interest, across Years 6-9 (upper primary through junior high), in the range of subjects (including science) that make up that country's national curriculum. There was a steady decline in interest in all these subjects, which might have indicated an adolescent reaction against schooling generally. However, this study went on to ask the students a further question that is very meaningful in the Japanese context: "If you discount the importance of this subject for university entrance, is it worth studying?" Science and mathematics remained in decline, while all the other subjects were seen more positively. It is thus ironic, at a time when some innovations in curriculum and other research-based findings are suggesting ways that these failures of school science might be corrected, to find school science under new demands that come from quite outside science education, and which certainly do not have the correction of this malaise as a priority. The positive curricular and research findings can be characterized as moves from within science education, whereas the new demands are moves that come from without science education. In this paper I set out these two rather contrary challenges to the teaching of science as it is currently practised, and go on to suggest a way forward that could fruitfully combine the two.

Relevance: 20.00%

Publisher:

Abstract:

In public venues, crowd size is a key indicator of crowd safety and stability. Crowding levels can be detected using holistic image features; however, this requires a large amount of training data to capture the wide variations in crowd distribution. If a crowd counting algorithm is to be deployed across a large number of cameras, such a large and burdensome training requirement is far from ideal. In this paper we propose an approach that uses local features to count the number of people in each foreground blob segment, so that the total crowd estimate is the sum of the group sizes. This results in an approach that is scalable to crowd volumes not seen in the training data and can be trained on a very small data set. As a local approach is used, the proposed algorithm can easily be used to estimate crowd density throughout different regions of the scene and in a multi-camera environment. A unique localised approach to ground truth annotation, which reduces the required training data, is also presented, as a localised approach to crowd counting has different training requirements to a holistic one. Testing on a large pedestrian database compares the proposed technique to existing holistic techniques and demonstrates improved accuracy, and superior performance when test conditions are unseen in the training set or a minimal training set is used.
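The blob-wise counting idea in this abstract can be sketched as follows. The linear area-to-count model and its coefficient are illustrative assumptions for the sketch, not the paper's actual local features or regression model.

```python
def estimate_group_size(blob_area, people_per_pixel=0.002):
    """Estimate the number of people in one foreground blob from its area
    (toy linear model; the real approach uses richer local features)."""
    return max(1, round(blob_area * people_per_pixel))

def total_crowd_estimate(blob_areas):
    """Total crowd = sum of per-blob group estimates, which is why the
    approach scales to crowd volumes never seen during training."""
    return sum(estimate_group_size(a) for a in blob_areas)

print(total_crowd_estimate([600, 1500, 450]))  # → 5 people across three blobs
```

Because each blob is counted independently, the same estimator can be applied per region of the scene or per camera without retraining.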

Relevance: 20.00%

Publisher:

Abstract:

Machine downtime, whether planned or unplanned, is intuitively costly to manufacturing organisations, but is often very difficult to quantify. The available literature shows that costing processes are rarely undertaken within manufacturing organisations. Where cost analyses have been undertaken, they have generally valued only a small proportion of the affected costs, leading to an overly conservative estimate. This thesis aimed to develop a cost of downtime model, with particular emphasis on the application of the model to Australia Post's Flat Mail Optical Character Reader (FMOCR). The costing analysis determined a cost of downtime of $5,700,000 per annum, or an average cost of $138 per operational hour. The second section of this work focused on the use of the cost of downtime to objectively determine areas of opportunity for cost reduction on the FMOCR. This was the first time within Post that maintenance costs were considered alongside downtime in determining machine performance. Because of this, the results of the analysis revealed areas which have historically not been targeted for cost reduction. Further exploratory work was undertaken on the Flats Lift Module (FLM) and Auto Induction Station (AIS) Deceleration Belts through the comparison of the results against two additional FMOCR analysis programs. This research has demonstrated the development of a methodical and quantifiable cost of downtime for the FMOCR. This is the first time that Post has endeavoured to examine the cost of downtime, and it is also one of the very few methodologies for valuing downtime costs that has been proposed in the literature. The work undertaken has also demonstrated how the cost of downtime can be incorporated into machine performance analysis, with specific application to identifying high-cost modules. The outcomes of this research are both the methodology for costing downtime and a list of areas for cost reduction.
In doing so, this thesis has outlined the two key deliverables presented at the outset of the research.
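The two headline figures quoted in the abstract are mutually consistent, which can be checked directly; the operational-hours figure below is derived from them, not stated in the thesis.

```python
# Figures from the costing analysis quoted above.
annual_downtime_cost = 5_700_000    # dollars per annum
cost_per_operational_hour = 138     # average dollars per operational hour

# Implied operational hours per year (derived, not stated in the source).
operational_hours = annual_downtime_cost / cost_per_operational_hour
print(round(operational_hours))  # → 41304
```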

Relevance: 20.00%

Publisher:

Abstract:

Objectives: To determine opinions and experiences of health professionals concerning the management of people with comorbid substance misuse and mental health disorders. Method: We conducted a survey of staff from mental health services and alcohol and drug services across Queensland. Survey items on problems and potential solutions had been generated by focus groups. Results: We analysed responses from 112 staff of alcohol and drug services and 380 mental health staff, representing a return of 79% and 42% respectively of the distributed surveys. One or more issues presented a substantial clinical management problem for 98% of respondents. Needs for increased facilities or services for dual disorder clients figured prominently. These included accommodation or respite care, work and rehabilitation programs, and support groups and resource materials for families. Needs for adolescent dual diagnosis services and after-hours alcohol and drug consultations were also reported. Each of these issues raised substantial problems for over 70% of staff. Another set of problems involved coordination of client care across mental health and alcohol and drug services, including disputes over duty of care. Difficulties with intersectoral liaison were more pronounced for alcohol and drug staff than for mental health staff. A majority of survey respondents identified 13 solutions as practical. These included routine screening for dual diagnosis at intake, and a range of proposals for closer intersectoral communication such as exchanging client information, developing shared treatment plans, conducting joint case conferences and offering consultation facilities. Conclusions: A wide range of problems for the management of comorbid disorders was identified. While solution of some problems will require resource allocation, many may be addressed by closer liaison between existing services.

Relevance: 20.00%

Publisher:

Abstract:

A continuing challenge for pre-service teacher education is the transfer of learning between the university-based components and the practical school-based components of training. It is not clear how easily pre-service teachers can transfer university learnings into 'in school' practice. Similarly, it is not clear how easily knowledge learned in the school context can be disembedded from this particular context and understood more generally by the pre-service teacher. This paper examines the effect of a community of practice formed specifically to explore learning transfer via collaboration and professional enquiry, in 'real time', across the globe. "Activity Theory" (Engestrom, 1999) provided the theoretical framework through which the cognitive, physical and social processes involved could be understood. For the study, three activity systems formed the community of practice network. The first activity system involved pre-service teachers at a large university in Queensland, Australia. The second activity system was introduced by the pre-service teachers and involved Year 12 students and teachers at a private secondary school, also in Queensland, Australia. The third activity system involved university staff engineers at a large university in Pennsylvania, USA. The common object among the three activity systems was to explore the principles and applications of nanotechnology. The participants in the two Queensland activity systems controlled laboratory equipment (a high-powered Atomic Force Microscope – CPII) in Pennsylvania, USA, with the aim of investigating surface topography and the properties of nanoparticles. The pre-service teachers were to develop their remote 'real time' experience into school classroom tasks, implement these tasks, and later report their findings to other pre-service teachers in the university activity system. As an extension to the project, the pre-service teachers were invited to co-author papers relating to the project.
Data were collected from (a) reflective journals; (b) participant field notes – a pre-service teacher initiative; (c) surveys – a pre-service teacher initiative; (d) lesson reflections and digital recordings – a pre-service teacher initiative; and (e) interviews with participants. The findings are reported in terms of three major themes: boundary crossing, the philosophy of teaching, and professional relationships. The findings have implications for teacher education. The researchers feel that deliberate planning for networking between activity systems may well be a solution to the apparent theory/practice gap. Proximity of activity systems need not be a hindering issue.

Relevance: 20.00%

Publisher:

Abstract:

Oberon-2 is an object-oriented language with a class structure based on type extension. The runtime structure of Oberon-2 is described and the low-level mechanism for dynamic type checking explained. It is shown that the superior type-safety of the language, when used for programming styles based on heterogeneous, pointer-linked data structures, has an entirely negligible cost in runtime performance.
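A common low-level implementation of the dynamic type test behind type extension can be sketched as follows, here in Python rather than Oberon-2 and with illustrative type names: each type descriptor keeps a table of its ancestors indexed by extension depth, so a type test is a single bounds check plus one indexed comparison, which is why its runtime cost is negligible.

```python
class TypeDesc:
    """Toy runtime type descriptor for a type-extension hierarchy."""
    def __init__(self, name, base=None):
        self.name = name
        # ancestors[i] is this type's ancestor at extension depth i
        # (the type itself is its own deepest ancestor).
        self.ancestors = (base.ancestors if base else []) + [self]
        self.depth = len(self.ancestors) - 1

def is_type(dynamic_type, static_type):
    """Constant-time type test: 'is dynamic_type an extension of static_type?'"""
    return (static_type.depth < len(dynamic_type.ancestors)
            and dynamic_type.ancestors[static_type.depth] is static_type)

Node = TypeDesc("Node")                       # hypothetical base type
TreeNode = TypeDesc("TreeNode", base=Node)    # hypothetical extension
print(is_type(TreeNode, Node), is_type(Node, TreeNode))  # → True False
```

The cost of the test does not grow with the depth of the extension hierarchy, consistent with the paper's claim of negligible runtime overhead for pointer-linked heterogeneous structures.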

Relevance: 20.00%

Publisher:

Abstract:

A set of five tasks was designed to examine dynamic aspects of visual attention: selective attention to color, selective attention to pattern, dividing and switching attention between color and pattern, and selective attention to pattern with changing target. These varieties of visual attention were examined using the same set of stimuli under different instruction sets; thus differences between tasks cannot be attributed to differences in the perceptual features of the stimuli. ERP data are presented for each of these tasks. A within-task analysis of different stimulus types varying in similarity to the attended target feature revealed that an early frontal selection positivity (FSP) was evident in selective attention tasks, regardless of whether color was the attended feature. The scalp distribution of a later posterior selection negativity (SN) was affected by whether the attended feature was color or pattern. The SN was largely unaffected by dividing attention across color and pattern. A large widespread positivity was evident in most conditions, consisting of at least three subcomponents which were differentially affected by the attention conditions. These findings are discussed in relation to prior research and the time course of visual attention processes in the brain.

Relevance: 20.00%

Publisher:

Abstract:

This thesis investigates the problem of robot navigation using only landmark bearings. The proposed system allows a robot to move to a ground target location specified by the sensor values observed at this ground target position. The control actions are computed based on the difference between the current landmark bearings and the target landmark bearings. No Cartesian coordinates with respect to the ground are computed by the control system. The robot navigates using solely information from the bearing sensor space. Most existing robot navigation systems require a ground frame (2D Cartesian coordinate system) in order to navigate from a ground point A to a ground point B. The commonly used sensors such as laser range scanners, sonar, infrared, and vision do not directly provide the 2D ground coordinates of the robot. The existing systems use the sensor measurements to localise the robot with respect to a map, a set of 2D coordinates of the objects of interest. It is more natural to navigate between the points in the sensor space corresponding to A and B without requiring the Cartesian map and the localisation process. Research on animals has revealed how insects are able to exploit very limited computational and memory resources to successfully navigate to a desired destination without computing Cartesian positions. For example, a honeybee balances the left and right optical flows to navigate in a narrow corridor. Unlike many other ants, Cataglyphis bicolor does not secrete pheromone trails in order to find its way home but instead uses the sun as a compass to keep track of its home direction vector. The home vector can be inaccurate, so the ant also uses landmark recognition. More precisely, it takes snapshots and compass headings of some landmarks. To return home, the ant tries to line up the landmarks exactly as they were before it started wandering. This thesis introduces a navigation method based on reflex actions in sensor space.
The sensor vector is made of the bearings of some landmarks, and the reflex action is a gradient descent with respect to the distance in sensor space between the current sensor vector and the target sensor vector. Our theoretical analysis shows that, except for some fully characterized pathological cases, any point is reachable from any other point by reflex action in the bearing sensor space provided the environment contains three landmarks and is free of obstacles. The trajectories of a robot using reflex navigation, like other image-based visual control strategies, do not necessarily correspond to the shortest paths on the ground, because the sensor error is minimized, not the moving distance on the ground. However, we show that the use of a sequence of waypoints in sensor space can address this problem. In order to identify relevant waypoints, we train a Self Organising Map (SOM) from a set of observations uniformly distributed with respect to the ground. This SOM provides a sense of location to the robot, and allows a form of path planning in sensor space. The proposed navigation system is analysed theoretically, and evaluated both in simulation and with experiments on a real robot.
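The reflex action can be illustrated numerically. Everything below (the three-landmark layout, the finite-difference gradient, and the step size) is an assumption made for the sketch; the thesis derives the control action analytically, whereas here we simply descend the bearing-space error.

```python
import math

LANDMARKS = [(0.0, 0.0), (10.0, 0.0), (5.0, 8.0)]  # illustrative layout

def bearings(p):
    """Sensor vector: the bearing from position p to each landmark."""
    return [math.atan2(ly - p[1], lx - p[0]) for lx, ly in LANDMARKS]

def error(p, target_bearings):
    """Squared distance in bearing sensor space (angles wrapped to (-pi, pi])."""
    return sum(math.atan2(math.sin(b - t), math.cos(b - t)) ** 2
               for b, t in zip(bearings(p), target_bearings))

def reflex_step(p, target_bearings, step=0.05, eps=1e-4):
    """One reflex action: finite-difference gradient descent on the error."""
    ex = (error((p[0] + eps, p[1]), target_bearings)
          - error((p[0] - eps, p[1]), target_bearings)) / (2 * eps)
    ey = (error((p[0], p[1] + eps), target_bearings)
          - error((p[0], p[1] - eps), target_bearings)) / (2 * eps)
    return (p[0] - step * ex, p[1] - step * ey)

target = bearings((6.0, 3.0))   # sensor values observed at the goal
p = (2.0, 1.0)
for _ in range(2000):
    p = reflex_step(p, target)
print(round(p[0], 2), round(p[1], 2))  # should approach the goal (6.0, 3.0)
```

Note that only bearings are used: no ground coordinates enter the control law, and the trajectory minimises sensor error rather than ground distance, as the abstract states.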

Relevance: 20.00%

Publisher:

Abstract:

An information filtering (IF) system monitors an incoming document stream to find the documents that match the information needs specified by the user profiles. Learning to use the user profiles effectively is one of the most challenging tasks when developing an IF system. With the document selection criteria better defined based on the users' needs, filtering large streams of information can be more efficient and effective. To learn the user profiles, term-based approaches have been widely used in the IF community because of their simplicity and directness. Term-based approaches are relatively well established. However, these approaches have problems when dealing with polysemy and synonymy, which often lead to an information overload problem. Recently, pattern-based approaches (or Pattern Taxonomy Models (PTM) [160]) have been proposed for IF by the data mining community. These approaches are better at capturing semantic information and have shown encouraging results for improving the effectiveness of IF systems. On the other hand, pattern discovery from large data streams is not computationally efficient. Also, these approaches have to deal with low-frequency pattern issues. The measures used by data mining techniques (for example, "support" and "confidence") to learn the profile have turned out to be unsuitable for filtering; they can lead to a mismatch problem. This thesis uses rough set-based reasoning (term-based) and a pattern mining approach as a unified framework for information filtering to overcome the aforementioned problems. The system consists of two stages: a topic filtering stage and a pattern mining stage. The topic filtering stage is intended to minimize information overload by filtering out the most likely irrelevant information based on the user profiles. A novel user-profile learning method and a theoretical model of the threshold setting have been developed by using rough set decision theory.
The second stage (pattern mining) aims at solving the problem of information mismatch. This stage is precision-oriented. A new document-ranking function has been derived by exploiting the patterns in the pattern taxonomy. The most likely relevant documents are assigned higher scores by the ranking function. Because a relatively small number of documents is left after the first stage, the computational cost is markedly reduced; at the same time, pattern discovery yields more accurate results. The overall performance of the system was improved significantly. The new two-stage information filtering model has been evaluated by extensive experiments. Tests were based on well-known IR benchmarking processes, using the latest version of the Reuters dataset, namely the Reuters Corpus Volume 1 (RCV1). The performance of the new two-stage model was compared with both term-based and data mining-based IF models. The results demonstrate that the proposed information filtering system significantly outperforms the other IF systems, such as the traditional Rocchio IF model, the state-of-the-art term-based models including BM25 and Support Vector Machines (SVM), and the Pattern Taxonomy Model (PTM).
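The two-stage structure described above can be sketched as follows. The scoring functions, profile, threshold and weights are all placeholders for the sketch, not the rough-set thresholding or the PTM ranking function actually derived in the thesis; only the pipeline shape (cheap recall-oriented filter, then precision-oriented pattern re-ranking of the survivors) is taken from the abstract.

```python
profile_terms = {"crowd": 2.0, "camera": 1.5, "safety": 1.0}  # illustrative profile
threshold = 1.0                                               # stage-1 cut-off (assumed)

def topic_score(doc_terms):
    """Stage 1: cheap term-based score used to discard likely-irrelevant docs."""
    return sum(profile_terms.get(t, 0.0) for t in doc_terms)

def pattern_rank(doc_terms, patterns):
    """Stage 2: precision-oriented re-ranking by matched term patterns."""
    return sum(w for pat, w in patterns.items() if pat <= set(doc_terms))

docs = [["crowd", "camera"], ["weather"], ["crowd", "safety", "camera"]]
patterns = {frozenset({"crowd", "camera"}): 3.0,
            frozenset({"crowd", "safety"}): 2.0}

survivors = [d for d in docs if topic_score(d) >= threshold]       # stage 1
ranked = sorted(survivors, key=lambda d: pattern_rank(d, patterns),
                reverse=True)                                      # stage 2
print([" ".join(d) for d in ranked])  # → ['crowd safety camera', 'crowd camera']
```

Because stage 2 only sees the survivors of stage 1, the expensive pattern matching runs on a much smaller document set, which is the efficiency argument the abstract makes.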

Relevance: 20.00%

Publisher:

Abstract:

The Node-based Local Mesh Generation (NLMG) algorithm, which is free of mesh inconsistency, is one of the core algorithms in the Node-based Local Finite Element Method (NLFEM) used to achieve the seamless link between mesh generation and stiffness matrix calculation; this seamless link helps to improve the parallel efficiency of FEM. Furthermore, the key to ensuring the efficiency and reliability of NLMG is to determine the candidate satellite-node set of a central node quickly and accurately. This paper develops a Fast Local Search Method based on Uniform Bucket (FLSMUB) and a Fast Local Search Method based on Multilayer Bucket (FLSMMB), and applies them successfully to this decisive problem, i.e. finding the candidate satellite-node set of any central node in the NLMG algorithm. Using FLSMUB or FLSMMB, the NLMG algorithm becomes a practical tool for reducing the parallel computation cost of FEM. Parallel numerical experiments validate that both FLSMUB and FLSMMB are fast, reliable and efficient for their suitable problems, and that they are especially effective for large-scale parallel problems.
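The uniform-bucket idea behind a method like FLSMUB can be sketched as follows; the grid size, node coordinates and the 3x3 neighbourhood rule are illustrative assumptions, not the paper's exact scheme. Nodes are hashed into a regular grid of buckets, and the candidate satellite-node set of a central node is gathered from its own bucket plus the adjacent ones, avoiding a scan over all nodes.

```python
from collections import defaultdict

CELL = 1.0  # bucket edge length (assumed)

def bucket_of(p):
    """Map a 2D point to its (integer) bucket coordinates."""
    return (int(p[0] // CELL), int(p[1] // CELL))

def build_buckets(nodes):
    """Hash every node index into its bucket."""
    buckets = defaultdict(list)
    for i, p in enumerate(nodes):
        buckets[bucket_of(p)].append(i)
    return buckets

def candidate_satellites(center, nodes, buckets):
    """Candidates = all nodes in the center's bucket and its 8 neighbours."""
    bx, by = bucket_of(nodes[center])
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            out.extend(i for i in buckets[(bx + dx, by + dy)] if i != center)
    return out

nodes = [(0.5, 0.5), (0.6, 0.7), (1.2, 0.4), (3.5, 3.5)]
buckets = build_buckets(nodes)
print(candidate_satellites(0, nodes, buckets))  # → [1, 2]; far node 3 excluded
```

With roughly uniform node density, each query touches a constant number of buckets, which is what makes such a search fast enough to sit inside a parallel mesh generator.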

Relevance: 20.00%

Publisher:

Abstract:

Objective: To summarise the extent to which narrative text fields in administrative health data are used to gather information about the event resulting in presentation to a health care provider for treatment of an injury, and to highlight best-practice approaches to interrogating narrative text for injury surveillance purposes.----- Design: Systematic review.----- Data sources: Electronic databases searched included CINAHL, Google Scholar, Medline, Proquest, PubMed and PubMed Central. Snowballing strategies were employed by searching the bibliographies of retrieved references to identify relevant associated articles.----- Selection criteria: Papers were selected if the study used a health-related database and if the study objectives were to (a) use text fields to identify injury cases or to extract additional information on injury circumstances not available from coded data, (b) use text fields to assess the accuracy of coded data fields for injury-related cases, or (c) describe methods/approaches for extracting injury information from text fields.----- Methods: The papers identified through the search were independently screened by two authors for inclusion, resulting in 41 papers selected for review. Due to heterogeneity between studies, meta-analysis was not performed.----- Results: The majority of papers reviewed (28 papers) focused on describing injury epidemiology trends using coded data supplemented by text fields, with these studies demonstrating the value of text data for providing more specific information beyond what had been coded, to enable case selection or provide circumstantial information. Caveats were expressed in terms of the consistency and completeness of recording of text information, resulting in underestimates when using these data. Four coding validation papers were reviewed, with these studies showing the utility of text data for validating and checking the accuracy of coded data.
Seven studies (9 papers) described methods for interrogating injury text fields for systematic extraction of information, with a combination of manual and semi-automated methods used to refine and develop algorithms for the extraction and classification of coded data from text. Quality assurance approaches to assessing the robustness of the extraction methods were discussed in only 8 of the epidemiology papers and 1 of the coding validation papers. All of the text interrogation methodology papers described systematic approaches to ensuring the quality of the approach.----- Conclusions: Manual review and coding approaches, text search methods, and statistical tools have been utilised to extract data from narrative text and translate it into useable, detailed injury event information. These techniques can be, and have been, applied to administrative datasets to identify specific injury types and add value to previously coded injury datasets. Only a few studies thoroughly described the methods used for text mining, and fewer than half of the reviewed studies used or described quality assurance methods for ensuring the robustness of the approach. New techniques utilising semi-automated computerised approaches and Bayesian/clustering statistical methods offer the potential to further develop and standardise the analysis of narrative text for injury surveillance.
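A minimal example of the kind of narrative-text interrogation reviewed above is a keyword/regex pass that flags probable fall-related cases in free-text fields; the pattern, terms and records below are invented for illustration, not drawn from any of the reviewed studies.

```python
import re

# Simple keyword/regex screen for probable fall-related injury narratives.
FALL_PATTERN = re.compile(r"\b(fell|fall|slipped|tripped)\b", re.IGNORECASE)

records = [
    "Pt slipped on wet floor, fractured wrist",
    "Burn to left hand from stove",
    "Fell from ladder while painting",
]

flagged = [r for r in records if FALL_PATTERN.search(r)]
print(len(flagged))  # → 2 of 3 narratives flagged as probable falls
```

As the review notes, such keyword screens are sensitive to inconsistent and incomplete recording, which is why the more rigorous studies pair them with manual review and quality assurance of the extraction rules.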

Relevance: 20.00%

Publisher:

Abstract:

Services in the form of business services or IT-enabled (Web) services have become a corporate asset of high interest in striving towards the agile organisation. However, while the design and management of a single service are widely studied and well understood, little is known about how a set of services can be managed. This gap motivated this paper, in which we explore the concept of Service Portfolio Management. In particular, we propose a Service Portfolio Management Framework that explicates service portfolio goals, tasks, governance issues, methods and enablers. The Service Portfolio Management Framework is based upon a thorough analysis and consolidation of existing, well-established portfolio management approaches. From an academic point of view, the Service Portfolio Management Framework can be positioned as an extension of portfolio management conceptualisations into the area of service management. Based on the framework, possible directions for future research are provided. From a practical point of view, the Service Portfolio Management Framework provides an organisation with a novel approach to managing its emerging service portfolios.