59 results for General circulation models
Abstract:
In 1990 the Dispute Resolution Centres Act 1990 (Qld) (the Act) was passed by the Queensland Parliament. In the second reading speech for the Dispute Resolution Centres Bill in May 1990, the Hon Dean Wells stated that the proposed legislation would make mediation services available “in a non-coercive, voluntary forum where, with the help of trained mediators, the disputants will be assisted towards their own solutions to their disputes, thereby ensuring that the result is acceptable to the parties” (Hansard, 1990, 1718). It was recognised at that time that a method was needed for resolving disputes for which “the conventional court system is not always equipped to provide lasting resolution” (Hansard, 1990, 1717). In particular, the new legislation was seen as making possible the lasting resolution of “disputes between people in continuing relationships”; for example, “domestic disputes, disputes between employees, and neighbourhood disputes relating to such issues as overhanging tree branches, dividing fences, barking dogs, smoke, noise and other nuisances are occurring continually in the community” (Hansard, 1990, 1717). The key features of the proposed form of mediation in the Act were articulated as follows: “attendance of both parties at mediation sessions is voluntary; a party may withdraw at any time; mediation sessions will be conducted with as little formality and technicality as possible; the rules of evidence will not apply; any agreement reached is not enforceable in any court; although it could be made so if the parties chose to proceed that way; and the provisions of the Act do not affect any rights or remedies that a party to a dispute has apart from the Act” (Hansard, 1990, 1718). Since the introduction of the Act, the Alternative Dispute Resolution Branch of the Queensland Department of Justice and Attorney-General has offered mediation services through, first, the Community Justice Program (CJP) and, then, the Dispute Resolution Centres (DRCs) for a range of family, neighbourhood, workplace and community disputes. These services have mirrored those available through similar government agencies in other states, such as the Community Justice Centres of NSW and the Victorian Dispute Resolution Centres. Since 1990, mediation has become one of the fastest growing forms of alternative dispute resolution (ADR). Sourdin has commented that “in addition to the growth in court-based and community-based dispute resolution schemes, ADR has been institutionalised and has grown within Australia and overseas” (2005, 14). In Australia in particular, the development of ADR service provision “has been assisted by the creation and growth of professional organisations such as the Leading Edge Alternative Dispute Resolvers (LEADR), the Australian Commercial Dispute Centres (ACDC), Australian Disputes Resolution Association (ADRA), Conflict Resolution Network, and the Institute of Arbitrators and Mediators Australia (IAMA)” (Sourdin, 2005, 14). The increased emphasis on the use of ADR within education contexts (particularly secondary and tertiary contexts) has “also led to an increasing acceptance and understanding of (ADR) processes” (Sourdin, 2005, 14). Proponents of the mediation process argue that much of its success derives from the inherent flexibility and creativity of the agreements reached through mediation, and that it is a relatively low-cost option in many cases (Menkel-Meadow, 1997, 417).
It is also accepted that much of the success of mediation can be attributed to the high level of participation by the parties involved, which creates a sense of ownership of, and commitment to, the terms of the agreement (Boulle, 2005, 65). These characteristics are associated with some of the core values of mediation, particularly as practised in community-based models such as those found at the DRCs. These core values include voluntary participation, party self-determination and party empowerment (Boulle, 2005, 65). For this reason, mediation is argued to be an effective approach to resolving disputes, one that creates a lasting resolution of the issues. Evaluation of the mediation process, particularly in the context of the growth of ADR, has been an important aspect of the development of the process (Sourdin, 2008). Writing in 2005, for example, Boulle states that “although there is a constant refrain for more research into mediation practice, there has been a not insignificant amount of mediation measurement, both in Australia and overseas” (Boulle, 2005, 575). The positive claims of mediation have been supported to a significant degree by evaluations of the efficiency and effectiveness of the process. A common indicator of the effectiveness of mediation is the settlement rate achieved. High settlement rates for mediated disputes have been found in Australia (Altobelli, 2003) and internationally (Alexander, 2003). Boulle notes that mediation agreement rates claimed by service providers range from 55% to 92% (Boulle, 2005, 590). The annual reports for the Alternative Dispute Resolution Branch of the Queensland Department of Justice and Attorney-General considered prior to the commencement of this study indicated that the Queensland Dispute Resolution Centres generally achieved a settlement rate of approximately 86%. More recently, the 2008-2009 annual report states that of the 2,291 civil disputes mediated in 2007-2008, 86% reached an agreement, and of the 2,693 civil disputes mediated in 2008-2009, 73% reached an agreement. These results are noted in the report as indicating “the effectiveness of mediation in resolving disputes” and as reflecting “the high level of agreement achieved for voluntary mediations” (Annual Report, 2008-2009, online). Whilst the settlement rates for the DRCs are strong, parties are rarely contacted for long-term follow-up to assess whether agreements reached during mediation lasted to the satisfaction of each party. It has certainly been the case that the Dispute Resolution Centres of Queensland have not been resourced to conduct long-term follow-up assessments of mediation agreements. As Wade notes, “it is very difficult to compare ‘success’ rates”, and whilst “politicians want the comparison studies (they) usually do not want the delay and expense of accurate studies” (1998, 114). To date, therefore, it is fair to say that the efficiency of the mediation process has been evaluated but not necessarily its effectiveness. Rather, the practice at the Queensland DRCs has been to evaluate the quality of mediation service provision and of the practice of the mediation process. This has occurred, for example, through follow-up surveys of parties' satisfaction with the mediation service. In most other respects it is fair to say that the Centres have relied on the high settlement rates of the mediation process as a sign of the effectiveness of mediation (Annual Reports, 1991-2010).
Research of the mediation literature conducted for the purpose of this thesis has also indicated that there is little evaluative literature providing an in-depth analysis and assessment of the longevity of mediated agreements. Instead, evaluative studies of mediation tend to assess how mediation is conducted, compare mediation with other conflict resolution options, or assess the agreement rate of mediations, including parties' levels of satisfaction with the service provision of the dispute resolution service provider (Boulle, 2005, Chapter 16).
Abstract:
The quality of conceptual business process models is highly relevant for the design of corresponding information systems. In particular, a precise measurement of model characteristics can be beneficial from a business perspective, helping to save costs thanks to early error detection. This is just as true from a software engineering point of view, where models facilitate stakeholder communication and software system design. Research has investigated several proposals for business process model measures, though from a rather correlational perspective. This is helpful for understanding, for example, size and complexity as general driving forces of error probability. Yet design decisions usually have to build on thresholds that can reliably indicate that a certain counter-action has to be taken. This cannot be achieved by providing measures alone; it requires a systematic identification of effective and meaningful thresholds. In this paper, we derive thresholds for a set of structural measures for predicting errors in conceptual process models. To this end, we use a collection of 2,000 business process models from practice as a means of determining thresholds, applying an adaptation of the ROC curves method. Furthermore, an extensive validation of the derived thresholds was conducted using 429 EPC models from an Australian financial institution. Finally, significant thresholds were adapted to refine existing modeling guidelines in a quantitative way.
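As a hedged illustration of the threshold-derivation step described above, the sketch below applies a common ROC-based cut-off rule (Youden's J statistic) to a single structural measure. The measure name, the data and the use of scikit-learn are illustrative assumptions; the paper's adaptation of the ROC curves method may differ in detail.

```python
import numpy as np
from sklearn.metrics import roc_curve

# Illustrative data: one structural measure per model (e.g., number of nodes)
# and a binary flag indicating whether the model contains an error.
size = np.array([12, 48, 33, 90, 21, 75, 60, 15, 82, 40])
has_error = np.array([0, 1, 0, 1, 0, 1, 1, 0, 1, 0])

# ROC analysis: sweep candidate thresholds over the measure and pick the
# cut-off that maximizes Youden's J = sensitivity + specificity - 1.
fpr, tpr, thresholds = roc_curve(has_error, size)
j = tpr - fpr
best = thresholds[np.argmax(j)]
print(f"suggested threshold: flag models with size >= {best}")
```

A threshold chosen this way has a direct operational reading: models above the cut-off are flagged for counter-action, which is the kind of guideline refinement the paper aims at.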
Abstract:
The introduction of the Australian curriculum, the use of standardised testing (e.g. NAPLAN) and the My School website have stimulated, and in some cases renewed, a range of boundaries for young people in Australian education. Standardised testing has accentuated social reproduction in education, with an increase in the numbers of students disengaging from mainstream education and applying for enrolment at the Edmund Rice Education Australia Flexible Learning Centre Network (EREAFLCN). Many young people are denied access to credentials and certification as they become excluded from standardised education and testing. The creativity and skills of marginalised youth are often evidence of general capabilities, yet do not appear to be recognised in mainstream educational institutions when standardised approaches are adopted. Young people who participate at the EREAFLCN arrive with a variety of forms of cultural capital, frequently utilising general capabilities, which are not able to be valued in current education and employment fields. This is not to say that these young people's different forms of cultural capital have no value, but rather that such funds of knowledge, repertoires and cultural capital are not valued by the majority of powerful agents in educational and employment fields. How then can the inherent value of traditionally unorthodox, yet often intricate, ingenious and astute, versions of cultural capital evident in the habitus of many young people be made to count, be recognised, be valuated? Can a process of educational assessment be a field of capital exchange and a space which crosses boundaries through a valuating process? This paper reports on the development of an innovative approach to assessment in an alternative education institution designed for the re-engagement of 'at risk' youth who have left formal schooling. A case study approach has been used to document the engagement of six young people with an educational approach described as assessment for learning as a field of exchange, across two sites in the EREAFLCN. In order to capture the broad range of students' cultural and social capital, an electronic portfolio system (EPS) is under trial. The model draws on categories from sociological models of capital and reconceptualises the eportfolio as a sociocultural zone of learning and development. Results from the trial show a general tendency towards engagement with the EPS and potential for the attainment of socially valued cultural capital in the form of school credentials. In this way restrictive boundaries can be breached and a more equitable outcome achieved for many young Australians.
Abstract:
This paper discusses the conceptualization, implementation and initial findings of a professional learning program (PLP) which used LEGO® robotics as one of the tools for teaching general technology (GT) in China's secondary schools. The program encouraged teachers to design learning environments that are realistic, authentic, engaging and fun. One hundred general technology teachers from high schools in 30 provinces of China participated. The program aimed to transform teachers' classroom practice, change their beliefs and attitudes, allow them to reflect deeply on what they do and, in turn, to provide their students with meaningful learning. Preliminary findings indicate that these teachers had a huge capacity for change. They were open-minded and absorbed new ways of learning and teaching. They became designers who developed innovative models of learning which incorporated learning processes that effectively used LEGO® robotics as one of the more creative tools for teaching GT.
Abstract:
This thesis develops a detailed conceptual design method and a system software architecture for a parametric and generative evolutionary design system to support an integrated, interdisciplinary building design approach. The research recognises the need to shift design effort toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication for the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and their ability to generate and assess a wider range of potential design solutions than a human could manage. This wider-ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research proposes a design method and system that promote a collaborative relationship between human creativity and computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This produces design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design. Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined, and the overall research was undertaken in three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that had not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the basis for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate the system performance based on the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through an initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components.
The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of designers' creativity within a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms. The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation, using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided by a higher-level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels: 'Room', 'Layout', 'Building' and 'Optimisation'. At each level, multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem so that the design requirements of each level are dealt with separately, and then reassembling them in a bottom-up approach, reduces the generation of non-viable solutions by constraining the options available at the next higher level. The iterative approach, in which the range of design solutions is explored through modification of the design schema as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity. By focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
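The following sketch illustrates, under stated assumptions, the idea of embedding level-specific fitness functions in a genetic algorithm. The genome encoding, the room_fitness and layout_fitness functions, and the GA operators are all hypothetical stand-ins; the HEAD system's actual schema-driven representation and operators are far richer.

```python
import random

# Hypothetical level-specific fitness functions: each scores one decomposed
# part of the design problem (the scoring rules are purely illustrative).
def room_fitness(genome):
    # e.g., reward rooms whose combined parameterised area is near a target
    return -abs(sum(genome[:4]) - 40)

def layout_fitness(genome):
    # e.g., penalise layouts with widely varying module sizes
    return -(max(genome) - min(genome))

def evolve(fitness, pop_size=50, length=8, generations=100):
    """Plain generational GA: tournament selection, one-point crossover, mutation."""
    pop = [[random.randint(1, 20) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a, b = (max(random.sample(pop, 3), key=fitness) for _ in range(2))
            cut = random.randrange(1, length)
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:
                child[random.randrange(length)] = random.randint(1, 20)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Bottom-up: solve the 'Room' level first, then re-score candidates with the
# combined objective at the 'Layout' level, mirroring the decomposition above.
room_best = evolve(room_fitness)
layout_best = evolve(lambda g: room_fitness(g) + layout_fitness(g))
```

Here the 'Layout' level re-scores candidates against a combined objective, a simplified echo of the bottom-up reassembly and level-specific fitness embedding described in the abstract.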
Abstract:
Traditional crash prediction models, such as generalized linear regression models, cannot take into account the multilevel data structure that extensively exists in crash data. Disregarding possible within-group correlations can lead to models that give unreliable and biased estimates of unknowns. This study proposes a five-level hierarchy crossed with time, viz. (Geographic region level – Traffic site level – Traffic crash level – Driver-vehicle unit level – Vehicle-occupant level) × Time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity due to the multilevel data structure, a framework of Bayesian hierarchical models that explicitly specify the multilevel structure and correctly yield parameter estimates is introduced and recommended. The proposed method is illustrated in an individual-severity analysis of intersection crashes using Singapore crash records. The study demonstrates the importance of accounting for within-group correlations and the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
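A minimal sketch of a two-level Bayesian hierarchical model of the kind recommended above, written with the PyMC library: crashes are nested in sites, and site-level random intercepts absorb within-site correlation. The grouping structure, the single covariate and the simulated data are illustrative assumptions, not the study's actual five-level specification.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)

# Illustrative data: binary severity outcomes for crashes nested in sites.
n_sites, n_crashes = 20, 400
site = rng.integers(0, n_sites, n_crashes)   # site index for each crash
speed = rng.normal(0, 1, n_crashes)          # one crash-level covariate
severe = rng.binomial(1, 0.3, n_crashes)     # observed outcome (simulated)

with pm.Model() as model:
    # Site-level random intercepts capture within-site correlation.
    mu_a = pm.Normal("mu_a", 0.0, 1.0)
    sigma_a = pm.HalfNormal("sigma_a", 1.0)
    a_site = pm.Normal("a_site", mu_a, sigma_a, shape=n_sites)

    beta = pm.Normal("beta", 0.0, 1.0)
    p = pm.math.invlogit(a_site[site] + beta * speed)
    pm.Bernoulli("severe", p=p, observed=severe)

    idata = pm.sample(1000, tune=1000)
```

Ignoring the a_site terms would collapse this to an ordinary logistic regression, which is precisely the within-group correlation the abstract warns against disregarding.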
Abstract:
The presence of a large number of single-phase distributed energy resources (DERs) can cause severe power quality problems in distribution networks. Because DERs can be installed in random locations, the generation in a particular phase may exceed the load demand in that phase, and the excess power in that phase will then be fed back to the transmission network. To avoid this problem, the paper proposes the use of a distribution static compensator (DSTATCOM) connected at the first bus following a substation. When operated properly, the DSTATCOM can facilitate a balanced set of currents flowing from the substation, even when excess power is generated by DERs. The proposals are validated through extensive digital computer simulation studies using PSCAD and MATLAB.
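To illustrate the balancing idea, the sketch below uses symmetrical components: if the substation is to supply only the positive-sequence (balanced) part of the feeder current, the compensator must inject the remainder. The phasor values are illustrative assumptions, and the paper's actual DSTATCOM control scheme is necessarily more involved.

```python
import numpy as np

a = np.exp(2j * np.pi / 3)  # Fortescue rotation operator (120 degrees)

# Illustrative unbalanced phase currents (phasors) seen at the first bus,
# i.e., load minus single-phase DER generation per phase.
i_abc = np.array([10 + 0j, 3 * a**2, 7 * a])  # phases a, b, c

# Positive-sequence component: the balanced part the substation should supply.
i_pos = (i_abc[0] + a * i_abc[1] + a**2 * i_abc[2]) / 3
i_supply = np.array([i_pos, a**2 * i_pos, a * i_pos])

# The compensator injects the difference, leaving balanced supply currents.
i_dstatcom = i_abc - i_supply
print(np.round(i_supply, 3))
print(np.round(i_dstatcom, 3))
```

The residual injected by the compensator contains the negative- and zero-sequence parts of the feeder current, which is what makes the substation-side currents balanced.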
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is a key success factor for organizations and even governments. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and guiding future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision-making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; a data capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. This approach yields highly informative estimates of change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.
The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which patients' pre-intervention characteristics affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix, through covariates in the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
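A minimal sketch of Bayesian change point estimation for a step change in a Poisson process, in the spirit of the estimators described above, written with the PyMC library. The data are simulated and the priors are illustrative assumptions; the thesis's estimators for linear trends, multiple changes and risk-adjusted settings are not shown.

```python
import numpy as np
import pymc as pm

# Illustrative counts (e.g., adverse events per month) with a step change.
rng = np.random.default_rng(1)
counts = np.concatenate([rng.poisson(4, 30), rng.poisson(9, 20)])
n = len(counts)

with pm.Model() as model:
    tau = pm.DiscreteUniform("tau", lower=0, upper=n - 1)  # change point time
    lam1 = pm.Exponential("lam1", 1.0)                     # rate before shift
    lam2 = pm.Exponential("lam2", 1.0)                     # rate after shift
    lam = pm.math.switch(tau > np.arange(n), lam1, lam2)
    pm.Poisson("obs", mu=lam, observed=counts)
    idata = pm.sample(2000, tune=1000)

# The posterior of tau is a full probability distribution over the change
# point time, not a single point estimate, which is what makes the estimate
# informative for root-cause analysis after a control chart signals.
```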
Abstract:
Identifying the design features that impact construction is essential to developing cost-effective and constructible designs. The similarity of building components is a critical design feature that affects method selection, productivity, and ultimately construction cost and schedule performance. However, there is limited understanding of what constitutes similarity in the design of building components and limited computer-based support for identifying this feature in a building product model. This paper contributes a feature-based framework for representing and reasoning about component similarity that builds on ontological modelling, model-based reasoning and cluster analysis techniques. It describes the ontology we developed to characterize component similarity in terms of the component attributes, the direction, and the degree of variation. It also describes the generic reasoning process we formalized to identify component similarity in a standard product model based on practitioners' varied preferences. The generic reasoning process evaluates the geometric, topological, and symbolic similarities between components, creates groupings of similar components, and quantifies the degree of similarity. We implemented this reasoning process in a prototype cost estimating application, which creates and maintains cost estimates based on a building product model. Validation studies of the prototype system provide evidence that the framework is general and enables a more accurate and efficient cost estimating process.
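As a hedged illustration of the cluster analysis ingredient, the sketch below groups components by attribute similarity using agglomerative clustering. The attribute set, the normalization and the cut-off distance standing in for a practitioner's preference are illustrative assumptions; the paper's ontology-driven reasoning over geometric, topological and symbolic similarity is much broader.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical component attributes extracted from a product model:
# columns = [length_m, width_m, depth_m]; rows = individual beam components.
beams = np.array([
    [6.0, 0.30, 0.45],
    [6.0, 0.30, 0.45],
    [6.1, 0.30, 0.45],
    [9.0, 0.40, 0.60],
    [9.0, 0.40, 0.60],
])

# Normalize attributes so each contributes comparably to the distance.
z = (beams - beams.mean(axis=0)) / beams.std(axis=0)

# Agglomerative clustering groups near-identical components; the cut-off
# distance plays the role of the practitioner's similarity preference.
groups = fcluster(linkage(z, method="average"), t=1.0, criterion="distance")
print(groups)  # e.g., [1 1 1 2 2]: two groups of similar beams
```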
Abstract:
In this paper we show that industry-based student training is not limited to work experience, work-integrated learning, internships or extended vacation work. It is also about bringing back the lost parts of technological education. We regard the one-sided focus on theoretical knowledge at the expense of skills and general competences as an important challenge in technological education. The lack of facilitation and training of practical skills and general competences in curricula and programs has been identified, but many institutions have failed to address the problem. Today's curricula in many ways reduce technology to abstract concepts, calculations and models, and create a gap between academic programs and practical applications in society. We explore two initiatives on industry-based student training (in Australia and Norway) and discuss how these initiatives address and bridge this gap. We argue that these initiatives contribute to bringing skills and general competences back into technological education, and that their effects are not limited to increased employability but also include improved academic performance.
Abstract:
Migraine is a common, heterogeneous and heritable neurological disorder. Its pathophysiology is incompletely understood, and its genetic influences at the population level are unknown. In a population-based genome-wide analysis including 5,122 migraineurs and 18,108 non-migraineurs, rs2651899 (1p36.32, PRDM16), rs10166942 (2q37.1, TRPM8) and rs11172113 (12q13.3, LRP1) were among the top seven associations (P < 5 × 10⁻⁶) with migraine. These SNPs were significant in a meta-analysis among three replication cohorts and met genome-wide significance in a meta-analysis combining the discovery and replication cohorts (rs2651899, odds ratio (OR) = 1.11, P = 3.8 × 10⁻⁹; rs10166942, OR = 0.85, P = 5.5 × 10⁻¹²; and rs11172113, OR = 0.90, P = 4.3 × 10⁻⁹). The associations at rs2651899 and rs10166942 were specific for migraine compared with non-migraine headache. None of the three SNP associations was preferential for migraine with aura or without aura, nor were any associations specific for migraine features. TRPM8 has been the focus of neuropathic pain models, whereas LRP1 modulates neuronal glutamate signaling, plausibly linking both genes to migraine pathophysiology.
Abstract:
Topic modelling, such as Latent Dirichlet Allocation (LDA), was proposed to generate statistical models that represent multiple topics in a collection of documents, and has been widely utilized in fields such as machine learning and information retrieval. However, its effectiveness in information filtering is largely unexplored. Patterns are generally thought to be more representative than single terms for representing documents. In this paper, a novel information filtering model, the Pattern-based Topic Model (PBTM), is proposed to represent text documents using not only topic distributions at the general level but also semantic pattern representations at the detailed, specific level; both contribute to accurate document representation and document relevance ranking. Extensive experiments are conducted to evaluate the effectiveness of PBTM using the TREC data collection Reuters Corpus Volume 1. The results show that the proposed model achieves outstanding performance.
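A minimal sketch of the general-level ingredient of such a model: per-document topic distributions from LDA, here obtained with the gensim library on a toy corpus. The corpus and parameters are illustrative assumptions, and PBTM's pattern-based representations at the specific level are not shown.

```python
from gensim import corpora, models

# Illustrative toy corpus; PBTM would start from document-topic
# distributions like these before mining patterns within each topic.
texts = [
    ["stock", "market", "trade", "price"],
    ["match", "team", "score", "league"],
    ["market", "price", "share", "profit"],
    ["team", "coach", "score", "season"],
]
dictionary = corpora.Dictionary(texts)
bow = [dictionary.doc2bow(t) for t in texts]

lda = models.LdaModel(corpus=bow, id2word=dictionary, num_topics=2,
                      random_state=0, passes=10)

# Per-document topic distribution: the "general level" representation.
for doc in bow:
    print(lda.get_document_topics(doc, minimum_probability=0.0))
```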
Abstract:
Whole-image descriptors have recently been shown to be remarkably robust to perceptual change, especially compared with local features. However, whole-image-based localization systems typically rely on heuristic methods for determining appropriate matching thresholds in a particular environment. These environment-specific tuning requirements and the lack of a meaningful interpretation of these arbitrary thresholds limit the general applicability of these systems. In this paper we present a Bayesian model of probability for whole-image descriptors that can be seamlessly integrated into localization systems designed for probabilistic visual input. We demonstrate this method using CAT-Graph, an appearance-based visual localization system originally designed for FAB-MAP-style probabilistic input. We show that using whole-image descriptors as visual input extends CAT-Graph's functionality to environments that experience a greater amount of perceptual change. We also present a method of estimating whole-image probability models in an online manner, removing the need for a prior training phase. We show that this online, automated training method can perform comparably to pre-trained, manually tuned local descriptor methods.
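A hedged sketch of the basic idea behind a probability model for whole-image descriptor matching: learn distance distributions for matching and non-matching image pairs, then convert an observed distance into a posterior match probability via Bayes' rule. The Gaussian distance models, their parameters and the prior are illustrative assumptions, not the paper's actual model.

```python
from scipy.stats import norm

# Assumed distributions of whole-image descriptor distances (parameters are
# illustrative): matching pairs yield small distances, non-matching larger.
match_dist = norm(loc=0.2, scale=0.10)
nonmatch_dist = norm(loc=0.6, scale=0.15)
prior_match = 0.01  # prior probability that two places are the same

def p_match(d):
    """Posterior probability of a place match given descriptor distance d."""
    num = match_dist.pdf(d) * prior_match
    den = num + nonmatch_dist.pdf(d) * (1 - prior_match)
    return num / den

print(p_match(0.25), p_match(0.55))
```

A posterior like this replaces an arbitrary matching threshold with a quantity a probabilistic localization back-end can consume directly; estimating the two distance distributions online is what removes the prior training phase.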
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations in the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
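A minimal sketch of one standard re-parameterisation ingredient, shown under the assumption that components are merged by moment matching: the merged Gaussian preserves the weight, mean and covariance of the two-component sub-mixture. Whether the paper's efficient step uses this exact rule is an assumption.

```python
import numpy as np

def merge(w1, m1, P1, w2, m2, P2):
    """Moment-preserving merge of two Gaussian components (weight, mean, cov).

    The merged component matches the zeroth, first and second moments of the
    two-component sub-mixture, the standard step in mixture reduction.
    """
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    P = (w1 * (P1 + np.outer(d1, d1)) + w2 * (P2 + np.outer(d2, d2))) / w
    return w, m, P

# Example: merge two 2-D components with illustrative parameters.
w, m, P = merge(0.6, np.array([0.0, 0.0]), np.eye(2),
                0.4, np.array([2.0, 1.0]), 0.5 * np.eye(2))
```

Repeatedly merging the closest pair of components keeps the mixture size bounded after each filter update, which is the role the re-parameterisation step plays above.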
Abstract:
Objective: To describe women's reports of the model of care options General Practitioners (GPs) discussed with them at the first pregnancy consultation, and women's self-reported role in decision-making about model of care. Methods: Women who had recently given birth responded to survey items about the models of care GPs discussed, their role in final decision-making, and socio-demographic, obstetric history, and early pregnancy characteristics. Results: The proportion of women with whom each model of care was discussed varied between 8.2% (private midwifery care with home birth) and 64.4% (GP shared care). Only 7.7% of women reported that all seven models were discussed. Exclusive discussion of private obstetric care or of all public models was common, and women's health insurance status was the strongest predictor of whether each model was discussed. Most women (82.6%) reported active involvement in final decision-making about model of care. Conclusion: Although most women report involvement in maternity model of care decisions, they remain largely uninformed about the breadth of available model of care options. Practical implications: Strategies that facilitate women's access to information on the differentiating features and outcomes of all models of care should be prioritized to better ensure equitable and quality decisions.