969 results for "level set method"


Relevance: 30.00%

Abstract:

New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet-based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom-up" component testing approach combined with "top-down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance was verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
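The error-budget bookkeeping described above can be sketched in a few lines. The component figures below are illustrative placeholders, not measured values from the paper; the idea is simply that each device's worst-case contribution is subtracted from the application's overall timing requirement.

```python
# Hypothetical component error figures in nanoseconds; real values
# would come from the component tests described in the paper.
BUDGET_NS = 1000  # 1 us sampling-accuracy requirement for sampled values

components = {
    "grandmaster": 100,
    "transparent_clock_1": 50,
    "transparent_clock_2": 50,
    "slave_clock": 200,
}

def remaining_budget(budget_ns, expenditures):
    """Subtract each component's worst-case error from the overall budget."""
    return budget_ns - sum(expenditures.values())

margin = remaining_budget(BUDGET_NS, components)
print(f"margin: {margin} ns")  # a positive margin means the design fits
```

A negative margin would tell the system designer that the selected components cannot meet the application's requirement before any system-level test is run.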

Relevance: 30.00%

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. Thus, the TxDOT Austin District Pavement Engineer (DPE) has investigated methods to strategically allocate available pavement funding to potential projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports the Austin District 4-year pavement management plan development. The study developed candidate project selection and ranking algorithms that evaluated the pavement condition of each candidate project using data contained in the Pavement Management Information System (PMIS) database and incorporated insights from Austin District pavement experts, and then implemented the developed method and supporting algorithm. This process previously required weeks to complete, but now requires about 10 minutes including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluation and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and allocating funds to the right projects at the right time.
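A network-level screening step of this kind can be sketched as a composite scoring and sort. The field names, weights and sample sections below are invented for illustration and are not TxDOT's actual PMIS attributes or the study's algorithm.

```python
# Illustrative candidate sections; "distress_score" and "ride_score"
# are hypothetical stand-ins for PMIS-style condition measures.
candidates = [
    {"id": "US-183 A", "distress_score": 62, "ride_score": 2.9, "adt": 41000},
    {"id": "SH-71 B",  "distress_score": 88, "ride_score": 3.8, "adt": 22000},
    {"id": "FM-969 C", "distress_score": 55, "ride_score": 2.4, "adt": 9000},
]

def priority(c):
    """Lower pavement condition and higher traffic -> higher priority."""
    condition = 0.6 * c["distress_score"] + 0.4 * (c["ride_score"] / 5.0) * 100
    return (100 - condition) * (c["adt"] / 10000)

ranked = sorted(candidates, key=priority, reverse=True)
print([c["id"] for c in ranked])
```

Sorting the whole network by such a score is what turns a weeks-long manual review into a run of a few minutes; the expert judgement then shifts to field-verifying the short list.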

Relevance: 30.00%

Abstract:

The purpose of this paper is to determine and discuss the plant and machinery valuation syllabus for higher education in Malaysia, to ensure the practicality of the subject in the real market. There have been limited studies in the plant and machinery area, either by scholars or practitioners. Most papers highlighted the methodologies, but few discussed plant and machinery valuation education. This paper determines inputs for plant and machinery valuation guidance, focussing on the syllabus set-up and references for valuers interested in this area of expertise. A qualitative approach via content analysis is conducted to compare international and Malaysian plant and machinery valuation syllabi and suggest improvements for the Malaysian syllabus. It is found that few higher education institutions in the world provide plant and machinery valuation courses as part of their property studies syllabus. Further investigation revealed that on-the-job training, building on the valuer's experience, is the preferred method of plant and machinery valuation education. The significance of this paper is to increase the level of understanding of plant and machinery valuation criteria and provide Malaysian stakeholders with suggestions on the relevant elements of a plant and machinery valuation education syllabus.

Relevance: 30.00%

Abstract:

Purpose – As a consequence of rapid urbanisation and globalisation, cities have become the engines of population and economic growth. Hence, natural resources in and around cities have been exposed to the externalities of urban development processes. This paper introduces a new sustainability assessment approach that is tested in a pilot study. The paper aims to assist policy-makers and planners in investigating the impacts of development on environmental systems, and in producing effective policies for sustainable urban development. Design/methodology/approach – The paper introduces an indicator-based indexing model entitled "Indexing Model for the Assessment of Sustainable Urban Ecosystems" (ASSURE). The ASSURE indexing model produces a set of micro-level environmental sustainability indices intended for use in evaluating and monitoring the interaction between human activities and urban ecosystems. The model is an innovative approach designed to assess the resilience of ecosystems to the impacts of current development plans, and the results serve as a guide for policy-makers to take action towards achieving sustainability. Findings – The indexing model has been tested in a pilot case study within the Gold Coast City, Queensland, Australia. This paper presents the methodology of the model and outlines the preliminary findings of the pilot study. The paper concludes with a discussion of the findings and recommendations put forward for future development and implementation of the model. Originality/value – Presently, only a few sustainability indices have been developed to measure sustainability at the local, regional, national and international levels. However, due to difficulties in data collection and the limited availability of local data, there is no effective assessment model that measures urban ecosystem sustainability accurately at the micro-level. The model introduced in this paper fills this gap by focusing on the parcel scale and benchmarking environmental performance at the micro-level.
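At its core, an indicator-based index of this kind aggregates normalised indicator scores into a single parcel-level value. The indicator names, weights and scores below are invented for illustration; they are not ASSURE's actual inputs or weighting scheme.

```python
# Minimal sketch of an indicator-based sustainability index for one
# parcel.  Indicators are assumed to be pre-normalised to [0, 1].
def sustainability_index(indicators, weights):
    """Weighted average of normalised indicator scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * indicators[k] for k in indicators)

parcel = {"air_quality": 0.8, "water_quality": 0.6, "habitat": 0.4}
weights = {"air_quality": 0.3, "water_quality": 0.3, "habitat": 0.4}
index = sustainability_index(parcel, weights)
print(round(index, 2))
```

Computing the same index for every parcel gives the micro-level benchmark map the abstract describes: parcels with low values flag where development pressure is eroding ecosystem performance.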

Relevance: 30.00%

Abstract:

Aim: This paper aims to explore new graduates' experiences of working with clients with mental health issues, using critical incident interviews. Methods: The qualitative research techniques were based on phenomenology. A purposive sample of 19 new graduate dietitians was drawn from a range of work settings and locations throughout Australia. Data were gathered using thirty-minute critical incident interviews. Audio-taped data were transcribed, coded to identify common themes, compared for congruence and then categorised into knowledge, skills and attitudes. Results: New graduates encountered a range of situations involving a variety of mental health, wellbeing, dietetic and clinical issues. Common themes highlighted the mental health knowledge, skills and attitudes required for entry-level dietitians, which then informed the review of the National Competency Standards for Entry-Level Dietitians. Conclusion: New graduates encounter a variety of mental health and wellbeing issues in their everyday practice and therefore require training to address these situations competently.

Relevance: 30.00%

Abstract:

We compare the consistency of choices in two methods used to elicit risk preferences, on an aggregate as well as on an individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while on an aggregate (subject-pool) level the results are (roughly) consistent, on an individual (within-subject) level behavior is far from consistent. Within each method as well as across methods we observe low correlations. This again questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
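The Holt-Laury list referred to above can be sketched concretely: in each of nine rows the probability of the high payoff rises by 10%, and a subject's risk attitude is read off the row at which they switch from the safe lottery A to the risky lottery B. The payoffs below are the standard Holt and Laury (2002) amounts; the switch-row logic is a simplified illustration of the instrument, not the paper's analysis.

```python
# Standard Holt-Laury payoffs: lottery A pays $2.00 or $1.60,
# lottery B pays $3.85 or $0.10; row k gives probability k/10
# to the high payoff in each lottery.
def expected_values(row):
    p = row / 10.0
    ev_a = p * 2.00 + (1 - p) * 1.60  # "safe" lottery A
    ev_b = p * 3.85 + (1 - p) * 0.10  # "risky" lottery B
    return ev_a, ev_b

def risk_neutral_switch_row():
    """First row at which a risk-neutral subject prefers lottery B."""
    for row in range(1, 10):
        ev_a, ev_b = expected_values(row)
        if ev_b > ev_a:
            return row
    return 10  # never switching indicates extreme risk aversion

print(risk_neutral_switch_row())
```

Switching later than the risk-neutral row indicates risk aversion; the paper's within-subject test amounts to comparing a subject's switch row across the two administrations of the list.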

Relevance: 30.00%

Abstract:

Content analysis of text offers a method for exploring experiences which usually remain unquestioned and unexamined. In this paper the authors analyse a set of patient progress notes by re-framing them as a narrative account of a significant event in the experience of a patient, her family and attending health care workers. Examination of these notes provides insights into aspects of clinical practice which are usually dealt with at a taken-for-granted level. An interpretation of previously unexamined therapeutic practices within the social and political context of institutional health care is offered.

Relevance: 30.00%

Abstract:

Power system restoration after a large-area outage involves many factors, and the procedure is usually very complicated. A decision-making support system could therefore be developed to find the optimal black-start strategy. In order to evaluate candidate black-start strategies, some indices, usually both qualitative and quantitative, are employed. However, it may not be possible to directly synthesize these indices, and different extents of interaction may exist among them. In the existing black-start decision-making methods, qualitative and quantitative indices cannot be well synthesized, and the interactions among different indices are not taken into account. The vague set, an extended version of the well-developed fuzzy set, can be employed to deal with decision-making problems with interacting attributes. Given this background, the vague set is first employed in this work to represent the indices so as to facilitate comparisons among them. Then, the concept of a vague-valued fuzzy measure is presented, and on that basis a mathematical model for black-start decision-making is developed. Compared with the existing methods, the proposed method can deal with the interactions among indices and represent the fuzzy information more reasonably. Finally, an actual power system is used to demonstrate the basic features of the developed model and method.
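The basic vague-set idea can be illustrated briefly. A vague value grades an index with a membership interval [t, 1 - f], where t is the truth-membership, f the false-membership, and t + f <= 1; the gap between them captures hesitancy that an ordinary fuzzy grade cannot. The simple t - f score below is a common way to compare vague values on a single index; it is a textbook illustration, not the paper's vague-valued fuzzy measure, which additionally models interactions between indices.

```python
# A vague value for an index is the pair (t, f) with t + f <= 1,
# read as the interval [t, 1 - f]; the width 1 - t - f is hesitancy.
def score(t, f):
    """Simple score function for ranking vague values on one index."""
    assert 0.0 <= t and 0.0 <= f and t + f <= 1.0
    return t - f

# Hypothetical gradings of two black-start strategies on one index.
strategy_a = score(0.7, 0.2)  # strong support, little evidence against
strategy_b = score(0.5, 0.4)  # weaker support, more evidence against
print(strategy_a > strategy_b)
```

Representing each index this way is what lets qualitative judgements ("restoration path reliability is good") sit alongside quantitative ones in a single comparison framework.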

Relevance: 30.00%

Abstract:

Background Non-fatal health outcomes from diseases and injuries are a crucial consideration in the promotion and monitoring of individual and population health. The Global Burden of Disease (GBD) studies done in 1990 and 2000 have been the only studies to quantify non-fatal health outcomes across an exhaustive set of disorders at the global and regional level. Neither effort quantified uncertainty in prevalence or years lived with disability (YLDs). Methods Of the 291 diseases and injuries in the GBD cause list, 289 cause disability. For 1160 sequelae of the 289 diseases and injuries, we undertook a systematic analysis of prevalence, incidence, remission, duration, and excess mortality. Sources included published studies, case notification, population-based cancer registries, other disease registries, antenatal clinic serosurveillance, hospital discharge data, ambulatory care data, household surveys, other surveys, and cohort studies. For most sequelae, we used a Bayesian meta-regression method, DisMod-MR, designed to address key limitations in descriptive epidemiological data, including missing data, inconsistency, and large methodological variation between data sources. For some disorders, we used natural history models, geospatial models, back-calculation models (models calculating incidence from population mortality rates and case fatality), or registration completeness models (models adjusting for incomplete registration with health-system access and other covariates). Disability weights for 220 unique health states were used to capture the severity of health loss. YLDs by cause at age, sex, country, and year levels were adjusted for comorbidity with simulation methods. We included uncertainty estimates at all stages of the analysis. Findings Global prevalence for all ages combined in 2010 across the 1160 sequelae ranged from fewer than one case per 1 million people to 350 000 cases per 1 million people. 
Prevalence and severity of health loss were weakly correlated (correlation coefficient −0·37). In 2010, there were 777 million YLDs from all causes, up from 583 million in 1990. The main contributors to global YLDs were mental and behavioural disorders, musculoskeletal disorders, and diabetes or endocrine diseases. The leading specific causes of YLDs were much the same in 2010 as they were in 1990: low back pain, major depressive disorder, iron-deficiency anaemia, neck pain, chronic obstructive pulmonary disease, anxiety disorders, migraine, diabetes, and falls. Age-specific prevalence of YLDs increased with age in all regions and has decreased slightly from 1990 to 2010. Regional patterns of the leading causes of YLDs were more similar compared with years of life lost due to premature mortality. Neglected tropical diseases, HIV/AIDS, tuberculosis, malaria, and anaemia were important causes of YLDs in sub-Saharan Africa. Interpretation Rates of YLDs per 100 000 people have remained largely constant over time but rise steadily with age. Population growth and ageing have increased YLD numbers and crude rates over the past two decades. Prevalences of the most common causes of YLDs, such as mental and behavioural disorders and musculoskeletal disorders, have not decreased. Health systems will need to address the needs of the rising numbers of individuals with a range of disorders that largely cause disability but not mortality. Quantification of the burden of non-fatal health outcomes will be crucial to understand how well health systems are responding to these challenges. Effective and affordable strategies to deal with this rising burden are an urgent priority for health systems in most parts of the world. Funding Bill & Melinda Gates Foundation.
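The core YLD quantity used throughout the study reduces to a simple product: years lived with disability for a sequela are its prevalent cases multiplied by its disability weight (before the comorbidity adjustment the abstract mentions). The case counts and weights below are round illustrative numbers, not GBD estimates.

```python
# Back-of-envelope sketch of the YLD calculation for one year.
def ylds(prevalent_cases, disability_weight):
    """Years lived with disability = prevalence x severity weight."""
    return prevalent_cases * disability_weight

# Illustrative inputs only; real disability weights come from the
# study's 220 health-state valuations.
low_back_pain = ylds(prevalent_cases=600e6, disability_weight=0.05)
migraine = ylds(prevalent_cases=850e6, disability_weight=0.02)
print(low_back_pain > migraine)
```

The example shows why prevalence and severity were only weakly correlated in the findings: a very common condition with a modest weight can contribute more YLDs than a rarer but more severe one.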

Relevance: 30.00%

Abstract:

Real-world AI systems have been recently deployed which can automatically analyze the plan and tactics of tennis players. As the game-state is updated regularly at short intervals (i.e. point-level), a library of successful and unsuccessful plans of a player can be learnt over time. Given the relative strengths and weaknesses of a player’s plans, a set of proven plans or tactics from the library that characterize a player can be identified. For low-scoring, continuous team sports like soccer, such analysis for multi-agent teams does not exist as the game is not segmented into “discretized” plays (i.e. plans), making it difficult to obtain a library that characterizes a team’s behavior. Additionally, as player tracking data is costly and difficult to obtain, we only have partial team tracings in the form of ball actions which makes this problem even more difficult. In this paper, we propose a method to overcome these issues by representing team behavior via play-segments, which are spatio-temporal descriptions of ball movement over fixed windows of time. Using these representations we can characterize team behavior from entropy maps, which give a measure of predictability of team behaviors across the field. We show the efficacy and applicability of our method on the 2010-2011 English Premier League soccer data.
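The entropy-map idea can be made concrete: for each field zone, the Shannon entropy of the play-segments observed there measures how predictable the team's ball movement is in that zone. The zone names and segment labels below are invented for illustration; only the entropy computation itself is standard.

```python
from collections import Counter
from math import log2

def zone_entropy(segments):
    """Shannon entropy (bits) of the play-segment labels seen in a zone."""
    counts = Counter(segments)
    n = len(segments)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A team that always produces the same segment in its own half is fully
# predictable there (entropy 0) and less predictable in attack.
own_half = ["long_ball"] * 8
attacking_third = ["cross", "through_ball", "cutback", "cross"]
print(zone_entropy(own_half), zone_entropy(attacking_third))
```

Plotting this value over a grid of zones yields the entropy map: low-entropy regions are where a team's behaviour is most characteristic and hence most exploitable by an opponent.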

Relevance: 30.00%

Abstract:

Curriculum documents for mathematics emphasise the importance of promoting depth of knowledge rather than shallow coverage of the curriculum. In this paper, we report on a study that explored the analysis of junior secondary mathematics textbooks to assess their potential to assist in teaching and learning aimed at building and applying deep mathematical knowledge. The method of analysis involved the establishment of a set of specific curriculum goals and associated indicators, based on research into the teaching and learning of a particular field within the mathematics curriculum, namely proportion and proportional reasoning. This topic was selected because of its pervasive nature throughout the school mathematics curriculum at this level. As a result of this study, it was found that the five textbook series examined provided limited support for the development of the multiplicative structures required for proportional reasoning, and hence would not serve well the development of deep learning of mathematics. The study demonstrated a method that could be applied to the analysis of junior secondary mathematics in many parts of the world.

Relevance: 30.00%

Abstract:

The purpose of this study was to determine factors (internal and external) that influenced Canadian provincial (state) politicians when making funding decisions about public libraries. Using the case study methodology, Canadian provincial/state level funding for public libraries in the 2009-10 fiscal year was examined. After reviewing funding levels across the country, three jurisdictions were chosen for the case: British Columbia's budget revealed dramatically decreased funding, Alberta's budget showed dramatically increased funding, and Ontario's budget was unchanged from the previous year. The primary source of data for the case was a series of semi-structured interviews with elected officials and senior bureaucrats from the three jurisdictions. An examination of primary and secondary documents was also undertaken to help set the political and economic context as well as to provide triangulation for the case interviews. The data were analysed to determine whether Cialdini's theory of influence (2001) and specifically any of the six tactics of influence (i.e., commitment and consistency, authority, liking, social proof, scarcity and reciprocity) were instrumental in these budget processes. Findings show the principles of "authority", "consistency and commitment" and "liking" were relevant, and that "liking" was especially important to these decisions. When these decision makers were considering funding for public libraries, they most often used three distinct lenses: the consistency lens (what are my values? what would my party do?), the authority lens (is someone with hierarchical power telling me to do this? are the requests legitimate?), and most importantly, the liking lens (how much do I like and know about the requester?). These findings are consistent with Cialdini's theory, which suggests the quality of some relationships is one of six factors that can most influence a decision maker.
The small number of prior research studies exploring the reasons for increases or decreases in public library funding allocation decisions has given little insight into the factors that motivate those politicians involved in the process and the variables that contribute to these decisions. No prior studies have examined the construct of influence in decision making about funding for Canadian public libraries at any level of government. Additionally, no prior studies have examined the construct of influence in decision making within the context of Canadian provincial politics. While many public libraries are facing difficult decisions in the face of uncertain funding futures, the ability of the sector to obtain favourable responses to requests for increases may require a less simplistic approach than previously thought. The ability to create meaningful connections with individuals in many communities and across all levels of government should be emphasised as a key factor in influencing funding decisions.

Relevance: 30.00%

Abstract:

In order to develop more inclusive products and services, designers need a means of assessing the inclusivity of existing products and new concepts. Following previous research on the development of scales for inclusive design at the University of Cambridge Engineering Design Centre (EDC) [1], this paper presents the latest version of the exclusion audit method. For a specific product interaction, this estimates the proportion of the Great British population who would be excluded from using a product or service, due to the demands the product places on key user capabilities. A critical part of the method involves rating the level of demand placed by a task on a range of key user capabilities, so the procedure to perform this assessment was operationalised and its reliability was then tested with 31 participants. There was no evidence that participants rated the same demands consistently. The qualitative results from the experiment suggest that the consistency of participants' demand level ratings could be significantly improved if the audit materials and their instructions better guided the participant through the judgement process.
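The exclusion estimate itself follows a simple rule: a product demands a level on each capability, and a person is excluded if any demand exceeds their ability. The five-person "population" and the 1-5 capability scales below are invented for illustration; a real audit would use national capability survey data.

```python
# Hypothetical capability profiles on illustrative 1-5 scales.
population = [
    {"vision": 5, "hearing": 4, "dexterity": 5},
    {"vision": 3, "hearing": 5, "dexterity": 4},
    {"vision": 4, "hearing": 2, "dexterity": 5},
    {"vision": 2, "hearing": 4, "dexterity": 3},
    {"vision": 5, "hearing": 5, "dexterity": 1},
]

def excluded_proportion(demands, population):
    """Fraction of people for whom any demand exceeds their capability."""
    excluded = sum(
        any(person[c] < level for c, level in demands.items())
        for person in population
    )
    return excluded / len(population)

# A task demanding vision level 3 and dexterity level 3:
print(excluded_proportion({"vision": 3, "dexterity": 3}, population))
```

The reliability problem the paper reports sits in the `demands` input: if auditors cannot agree on the demand levels a task places, the computed exclusion proportion inherits that disagreement.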

Relevance: 30.00%

Abstract:

Recent fire research into the behaviour of light gauge steel frame (LSF) wall systems has developed fire design rules based on Australian and European cold-formed steel design standards, AS/NZS 4600 and Eurocode 3 Part 1.3. However, these design rules are complex since the LSF wall studs are subjected to non-uniform elevated temperature distributions when the walls are exposed to fire from one side. Therefore this paper proposes an alternative design method for routine predictions of the fire resistance rating of LSF walls. In this method, suitable equations are recommended first to predict the idealised stud time-temperature profiles of eight different LSF wall configurations subject to standard fire conditions based on full scale fire test results. A new set of equations was then proposed to find the critical hot flange (failure) temperature for a given load ratio for the same LSF wall configurations with varying steel grades and thickness. These equations were developed based on detailed finite element analyses that predicted the axial compression capacities and failure times of LSF wall studs subject to non-uniform temperature distributions with varying steel grades and thicknesses. This paper proposes a simple design method in which the two sets of equations developed for time-temperature profiles and critical hot flange temperatures are used to find the failure times of LSF walls. The proposed method was verified by comparing its predictions with the results from full scale fire tests and finite element analyses. This paper presents the details of this study including the finite element models of LSF wall studs, the results from relevant fire tests and finite element analyses, and the proposed equations.
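The two-step use of the equation sets can be sketched as follows. The linear coefficients below are hypothetical placeholders, not the fitted equations from the paper: step one gives the stud's hot-flange temperature as a function of time, step two gives the critical temperature for the applied load ratio, and the failure time is where the first curve reaches the second.

```python
# Step 2 equations (illustrative): critical hot-flange temperature
# in deg C falls as the load ratio rises.
def critical_hot_flange_temp(load_ratio):
    return 700 - 400 * load_ratio  # hypothetical linear fit

# Step 1 equations (illustrative): idealised hot-flange
# time-temperature profile for one wall configuration.
def hot_flange_temp(t_minutes):
    return 20 + 8 * t_minutes  # hypothetical profile

def failure_time(load_ratio, step=0.5):
    """Time (minutes) at which the stud reaches its critical temperature."""
    t_crit = critical_hot_flange_temp(load_ratio)
    t = 0.0
    while hot_flange_temp(t) < t_crit:
        t += step
    return t

print(failure_time(0.4))
```

The attraction of the method is visible in the sketch: once both equation sets are tabulated for a wall configuration, the fire resistance rating follows from simple evaluation rather than a new finite element analysis.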

Relevance: 30.00%

Abstract:

In this study x-ray CT has been used to produce a 3D image of an irradiated PAGAT gel sample, with noise-reduction achieved using the ‘zero-scan’ method. The gel was repeatedly CT scanned and a linear fit to the varying Hounsfield unit of each pixel in the 3D volume was evaluated across the repeated scans, allowing a zero-scan extrapolation of the image to be obtained. To minimise heating of the CT scanner’s x-ray tube, this study used a large slice thickness (1 cm), to provide image slices across the irradiated region of the gel, and a relatively small number of CT scans (63), to extrapolate the zero-scan image. The resulting set of transverse images shows reduced noise compared to images from the initial CT scan of the gel, without being degraded by the additional radiation dose delivered to the gel during the repeated scanning. The full, 3D image of the gel has a low spatial resolution in the longitudinal direction, due to the selected scan parameters. Nonetheless, important features of the dose distribution are apparent in the 3D x-ray CT scan of the gel. The results of this study demonstrate that the zero-scan extrapolation method can be applied to the reconstruction of multiple x-ray CT slices, to provide useful 2D and 3D images of irradiated dosimetry gels.
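The zero-scan extrapolation described above amounts to a per-pixel linear regression over the repeated scans, keeping the intercept as the noise-reduced image. The sketch below assumes the scans are stacked into a NumPy array and uses a synthetic phantom with noise and a slow per-scan drift; array shapes and the phantom are illustrative, not the study's data.

```python
import numpy as np

def zero_scan(scans):
    """Extrapolate a noise-reduced 'zero-scan' image from repeated scans.

    scans: array of shape (n_scans, *image_shape) of Hounsfield units.
    A straight line is fitted to each pixel's value as a function of
    scan index; the intercept (the value extrapolated to scan 0) is
    returned as the zero-scan image.
    """
    n = scans.shape[0]
    x = np.arange(n, dtype=float)
    flat = scans.reshape(n, -1)          # one column per pixel
    coeffs = np.polyfit(x, flat, deg=1)  # shape (2, n_pixels)
    intercepts = coeffs[1]               # polynomial value at x = 0
    return intercepts.reshape(scans.shape[1:])

# Synthetic phantom: a uniform irradiated square, plus per-scan noise
# and a slow dose-induced drift of 0.5 HU per scan.
rng = np.random.default_rng(0)
truth = np.zeros((8, 8))
truth[2:6, 2:6] = 100.0
scans = np.stack([truth + 0.5 * k + rng.normal(0, 5, truth.shape)
                  for k in range(63)])

estimate = zero_scan(scans)
```

Because the fit absorbs the scan-to-scan drift into the slope, the intercept image is free of both the random noise and the additional signal accumulated during repeated scanning, which is the property the study exploits.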