310 results for Proven Reserves


Relevance:

10.00%

Publisher:

Abstract:

Environmental impacts caused during Australia's comparatively recent settlement by Europeans are evident. Governments (both Commonwealth and State) have been largely responsible for requiring landholders – through leasehold development conditions and taxation concessions – to conduct clearing that is now perceived as damage. Most governments are now demanding resource protection. There is a measure of bewilderment (if not resentment) among landholders because of this change. The more populous States, where most overall damage has been done (i.e. Victoria and New South Wales), provide most support for attempts to stop development in other regions where there has been less damage. Queensland, the north-eastern quarter of the continent, has been relatively slow to develop. It also holds the largest and most diverse natural environments. Tree clearing is an unavoidable element of land development, whether to access and enhance native grasses for livestock or to allow for urban developments (with exotic tree plantings). The regulatory consequences are particularly complex because of the dynamic nature of vegetation. The regulatory terms used in current legislation – such as 'Endangered' and 'Of concern' – depend on legally defined, static baselines. Regrowth and fire damage are two obvious causes of change. A less obvious aspect is succession, where ecosystems change naturally over long timeframes. In the recent past, the Queensland Government encouraged extensive tree clearing, e.g. through the State Brigalow Development Scheme (mostly 1962 to 1975), which resulted in the removal of some 97% of the wide-ranging mature forests of Acacia harpophylla. At the same time, this government controls National Parks and other reservations (occupying some 4% of the State's 1.7 million km² area) and also holds major World Heritage Areas (such as the Great Barrier Reef and the Wet Tropics Rainforest) promulgated under Commonwealth legislation. This is a highly prescriptive approach, where the community is directed on the one hand to develop (largely through lease conditions) and on the other to avoid development (largely by unusable reserves). Another approach to development and conservation is still possible in Queensland. For this to occur, however, a more workable and equitable solution than has been employed to date is needed, especially for the remote lands of this State. This must involve resident landholders, who have the capacity (through local knowledge, infrastructure and daily presence) to undertake sustainable land-use management most cost-effectively (with suitable attention to ecosystems requiring special conservation effort), provided they have the necessary direction, encouragement and incentive to do so.

Relevance:

10.00%

Publisher:

Abstract:

A graduate destination survey can provide a snapshot in time of a graduate’s career progression and outcome. This paper presents the results of a Queensland University of Technology study exploring the employment outcomes of students who had completed a library and information science course offered by the Faculty of Information Technology between 2000 and 2008. Seventy-four graduates completed an online questionnaire administered in July 2009. The study found that 90% of the graduates surveyed were working and living in Queensland, with over three quarters living and working in Brisbane. Nearly 70% were working full-time, while only 1.4% indicated that they were unemployed and looking for work. Over 80% of the graduates identified themselves as working in “librarianship”. This study is the first step in understanding the progression and destination of QUT’s library and information science graduates. It is recommended that this survey become an ongoing initiative so that the results can be analysed and compared over time.

Relevance:

10.00%

Publisher:

Abstract:

The 5th International Conference on Field and Service Robotics (FSR05) was held in Port Douglas, Australia, on 29–31 July 2005, and brought together the world's leading experts in field and service automation. The goal of the conference was to report and encourage the latest research and practical results towards the use of field and service robotics in the community, with a particular focus on proven technology. The conference provided a forum for researchers, professionals and robot manufacturers to exchange up-to-date technical knowledge and experience. Field robots are robots that operate in outdoor, complex and dynamic environments. Service robots are those that work closely with humans, with particular applications involving indoor and structured environments. A wide range of topics on field and service robots is presented in this issue, including: Agricultural and Forestry Robotics, Mining and Exploration Robots, Robots for Construction, Security & Defence Robots, Cleaning Robots, Autonomous Underwater Vehicles and Autonomous Flying Robots. This meeting was the fifth in the series and brought FSR back to Australia, where it was first held. FSR has been held every two years, starting with Canberra in 1997, followed by Pittsburgh (1999), Helsinki (2001) and Lake Yamanaka (2003).

Relevance:

10.00%

Publisher:

Abstract:

Recently, numerical modelling and simulation of the anomalous subdiffusion equation (ASDE), a type of fractional partial differential equation (FPDE) with wide applications in modern engineering and science, have been attracting increasing attention. The currently dominant numerical method for modelling the ASDE is the finite difference method (FDM), which is based on a pre-defined grid and therefore carries inherent shortcomings. This paper aims to develop an implicit meshless approach based on radial basis functions (RBFs) for the numerical simulation of the non-linear ASDE. The discrete system of equations is obtained by using the meshless shape functions and the strong form of the governing equation. The stability and convergence of this meshless approach are then discussed and theoretically proven. Several numerical examples with different problem domains are used to validate and investigate the accuracy and efficiency of the newly developed meshless formulation. The results obtained by the meshless formulation are also compared with those obtained by the FDM in terms of accuracy and efficiency. It is concluded that the present meshless formulation is very effective for the modelling and simulation of the ASDE. Therefore, the meshless technique should have good potential in the development of a robust simulation tool for problems in engineering and science that are governed by the various types of fractional differential equations.
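The abstract does not give the discretisation details, so the following is only a minimal, assumed sketch of the kind of machinery involved: a linear, Caputo-form subdiffusion equation D_t^alpha u = K u_xx on [0, 1], discretised in time with the standard L1 scheme and in space by strong-form multiquadric RBF collocation. All parameter values and the test problem are invented for illustration and do not come from the paper.

```python
import numpy as np
from scipy.special import gamma

# Minimal sketch (assumed, not the paper's scheme): implicit RBF collocation
# for the linear Caputo-form subdiffusion equation  D_t^alpha u = K u_xx  on
# [0, 1] with zero Dirichlet boundaries and an L1 time discretisation.
alpha, K = 0.7, 1.0            # fractional order and diffusivity (assumed)
N, M, dt = 41, 200, 1e-3       # collocation nodes, time steps, step size
c = 0.1                        # multiquadric shape parameter (assumed)

x = np.linspace(0.0, 1.0, N)
r = x[:, None] - x[None, :]
Phi   = np.sqrt(r**2 + c**2)            # multiquadric basis phi(r)
Phixx = c**2 / (r**2 + c**2)**1.5       # second x-derivative of phi

c0 = dt**(-alpha) / gamma(2.0 - alpha)  # L1 scheme prefactor
b = np.arange(1, M + 1)**(1 - alpha) - np.arange(M)**(1 - alpha)

# Collocation matrix: interior rows enforce the PDE, boundary rows enforce u = 0.
A = c0 * Phi - K * Phixx
A[0, :], A[-1, :] = Phi[0, :], Phi[-1, :]

u = [np.sin(np.pi * x)]                 # initial condition (assumed)
for n in range(1, M + 1):
    # History term of the L1 discretisation of the Caputo derivative.
    hist = sum(b[k] * (u[n - k] - u[n - k - 1]) for k in range(1, n))
    rhs = c0 * (u[n - 1] - hist)
    rhs[0] = rhs[-1] = 0.0              # Dirichlet boundary values
    lam = np.linalg.solve(A, rhs)       # RBF coefficients at t_n
    u.append(Phi @ lam)
```

Because the discrete operators are assembled directly from scattered nodes, refining or redistributing the node set only changes the Phi and Phixx matrices; this grid freedom is what the abstract contrasts with the pre-defined grid of the FDM.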

Relevance:

10.00%

Publisher:

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data. Several pre-processing and predictive data mining methods are applied to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population; subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with the misclassification rate and the area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to give slightly increased accuracy at markedly increased complexity. The use of misclassification rate (MR) for model performance evaluation is influenced by class distribution. This could be eliminated by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with time-segmented summary data (dataset F) having an MR of 9.8 and raw time-series summary data (dataset A) 9.92. However, for all datasets based on time-series data alone, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets are of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) at 8.85 and dataset RF_F (time-segmented time-series variables and RF) at 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on derived variables based on time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform the least well, with MRs of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MRs, of 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MRs of 8.85 and 9.0 respectively. DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by the addition of risk factor variables to models based on time-series variables is significant. The addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules that are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables, compared with the use of risk factors alone, is consistent with recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables were used as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
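As a rough illustration of the evaluation loop described above (majority-class under-sampling, filter-type feature selection, and reporting of misclassification rate, Kappa and AUC), the sketch below uses scikit-learn stand-ins for the Weka tools named in the abstract (a depth-limited decision tree in place of J48, plus logistic regression) on synthetic, imbalanced data; none of the variables, parameters or figures correspond to the study's datasets.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import cohen_kappa_score, roc_auc_score

# Imbalanced toy data standing in for time-series summaries plus risk factors.
X, y = make_classification(n_samples=1000, n_features=40, weights=[0.9, 0.1],
                           random_state=0)

def undersample(X, y, rng):
    """Randomly drop majority-class rows until the two classes are balanced."""
    maj, mino = np.where(y == 0)[0], np.where(y == 1)[0]
    keep = np.concatenate([rng.choice(maj, size=len(mino), replace=False), mino])
    return X[keep], y[keep]

rng = np.random.default_rng(0)
models = {"LR": LogisticRegression(max_iter=1000),
          "DT": DecisionTreeClassifier(max_depth=4)}

for name, model in models.items():
    mrs, kappas, aucs = [], [], []
    for tr, te in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
        Xtr, ytr = undersample(X[tr], y[tr], rng)            # balance the training fold
        sel = SelectKBest(f_classif, k=10).fit(Xtr, ytr)     # filter-type feature selection
        model.fit(sel.transform(Xtr), ytr)
        pred = model.predict(sel.transform(X[te]))
        prob = model.predict_proba(sel.transform(X[te]))[:, 1]
        mrs.append(100 * np.mean(pred != y[te]))              # misclassification rate (%)
        kappas.append(cohen_kappa_score(y[te], pred))
        aucs.append(roc_auc_score(y[te], prob))
    print(f"{name}: MR={np.mean(mrs):.1f}%  kappa={np.mean(kappas):.2f}  AUC={np.mean(aucs):.2f}")
```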

Relevance:

10.00%

Publisher:

Abstract:

Background: Length of hospital stay (LOS) is a surrogate marker for patients' well-being during hospital treatment and is associated with health care costs. Identifying pretreatment factors associated with LOS in surgical patients may enable early intervention in order to reduce postoperative LOS. Methods: This cohort study enrolled 157 patients with suspected or proven gynecological cancer at a tertiary cancer centre (2004-2006). Before commencing treatment, the scored Patient-Generated Subjective Global Assessment (PG-SGA), measuring nutritional status, and the Functional Assessment of Cancer Therapy-General (FACT-G) scale, measuring quality of life (QOL), were completed. Clinical and demographic patient characteristics were prospectively obtained. Patients were grouped into those with prolonged LOS (hospital stay greater than the median LOS) and those with average or below-average LOS. Results: Patients' mean age was 58 years (SD 14 years). Preoperatively, 81 patients (52%) presented with suspected benign disease/pelvic mass, 23 (15%) with suspected advanced ovarian cancer, 36 (23%) with suspected endometrial cancer and 17 (11%) with cervical cancer. In univariate models, prolonged LOS was associated with low serum albumin or hemoglobin, malnutrition (PG-SGA score and PG-SGA group B or C), low pretreatment FACT-G score, and a suspected diagnosis of cancer. In multivariable models, PG-SGA group B or C, FACT-G score and a suspected diagnosis of advanced ovarian cancer independently predicted LOS. Conclusions: Malnutrition, low quality-of-life scores and a diagnosis of advanced ovarian cancer are the major determinants of prolonged LOS amongst gynecological cancer patients. Interventions addressing malnutrition and poor QOL may decrease LOS in gynecological cancer patients.
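Purely as an illustration of the multivariable modelling step, and not the study's analysis, the sketch below fits a logistic model for prolonged LOS on synthetic data with placeholder predictors mirroring those named in the abstract; the data and coefficients carry no clinical meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 157
X = np.column_stack([
    rng.normal(38, 5, n),          # serum albumin (g/L), synthetic
    rng.integers(0, 2, n),         # malnourished: PG-SGA group B or C, synthetic
    rng.normal(80, 12, n),         # pretreatment FACT-G score, synthetic
    rng.integers(0, 2, n),         # suspected advanced ovarian cancer, synthetic
])
# Synthetic outcome loosely tied to the predictors, for demonstration only.
logit = -0.05 * X[:, 0] + 1.2 * X[:, 1] - 0.03 * X[:, 2] + 1.5 * X[:, 3] + 3.5
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(["albumin", "PG-SGA B/C", "FACT-G", "adv. ovarian ca."],
               model.coef_[0].round(3))))
```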

Relevance:

10.00%

Publisher:

Abstract:

Geographic information is increasingly being touted for use in research and industrial projects. While the technology is now available and affordable, there is a lack of easy-to-use software that takes advantage of geographic information. This is an important problem because users are often researchers or scientists with limited software skills; by providing applications that are easier to use, time and financial resources can be diverted from training and better applied to the actual research and development work. A solution to this problem must cater for user and research needs. In particular, it must allow for mobile operation for fieldwork, flexibility or customisability of data input, sharing of data with other tools, and collaborative capabilities for the usual teamwork environment. This thesis has developed a new architecture and data model to achieve this solution. The result is the Mobile Collaborative Annotation framework, which implements the new architecture and data model. Mobile Collaborative Mapping implements the framework as a Web 2.0 mashup rich internet application and has proven to be an effective solution through its positive application to a case study with fieldwork scientists. This thesis has contributed to research into mobile computing, collaborative computing and geospatial systems by creating a simpler entry point to mobile geospatial applications, enabling simplified collaboration and providing tangible time savings.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents a new winding configuration method for planar magnetic elements with more than two layers. It has been proven by the 3D finite element method and mathematical modelling that the suggested configuration reduces the equivalent capacitive coupling of the planar inductor.

Relevance:

10.00%

Publisher:

Abstract:

A group key exchange (GKE) protocol allows a set of parties to agree upon a common secret session key over a public network. In this thesis, we focus on designing efficient GKE protocols using public key techniques and appropriately revising security models for GKE protocols. For the purpose of modelling and analysing the security of GKE protocols we apply the widely accepted computational complexity approach. The contributions of the thesis to the area of GKE protocols are manifold. We propose the first GKE protocol that requires only one round of communication and is proven secure in the standard model. Our protocol is generically constructed from a key encapsulation mechanism (KEM). We also suggest an efficient KEM from the literature, which satisfies the underlying security notion, to instantiate the generic protocol. We then concentrate on enhancing the security of one-round GKE protocols. A new model of security for forward secure GKE protocols is introduced and a generic one-round GKE protocol with forward security is then presented. The security of this protocol is also proven in the standard model. We also propose an efficient forward secure encryption scheme that can be used to instantiate the generic GKE protocol. Our next contributions are to the security models of GKE protocols. We observe that the analysis of GKE protocols has not been as extensive as that of two-party key exchange protocols. Particularly, the security attribute of key compromise impersonation (KCI) resilience has so far been ignored for GKE protocols. We model the security of GKE protocols addressing KCI attacks by both outsider and insider adversaries. We then show that a few existing protocols are not secure against KCI attacks. A new proof of security for an existing GKE protocol is given under the revised model assuming random oracles. Subsequently, we treat the security of GKE protocols in the universal composability (UC) framework. We present a new UC ideal functionality for GKE protocols capturing the security attribute of contributiveness. An existing protocol with minor revisions is then shown to realize our functionality in the random oracle model. Finally, we explore the possibility of constructing GKE protocols in the attribute-based setting. We introduce the concept of attribute-based group key exchange (AB-GKE). A security model for AB-GKE and a one-round AB-GKE protocol satisfying our security notion are presented. The protocol is generically constructed from a new cryptographic primitive called encapsulation policy attribute-based KEM (EP-AB-KEM), which we introduce in this thesis. We also present a new EP-AB-KEM with a proof of security assuming generic groups and random oracles. The EP-AB-KEM can be used to instantiate our generic AB-GKE protocol.
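The thesis's constructions are not reproduced here, but one folklore way a KEM yields a one-round GKE is easy to sketch: a single member encapsulates key material to every other member's public key in one broadcast. The toy example below uses an X25519-based DH-KEM built from the Python cryptography package; it is unauthenticated, ignores the security models discussed above, and should not be read as the protocol proposed in the thesis.

```python
# Toy, insecure illustration only: one-round group key distribution from a
# DH-based KEM (X25519 + HKDF). All names are placeholders.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def kem_encap(pk):
    """Encapsulate: fresh ephemeral DH share plus a derived symmetric key."""
    eph = X25519PrivateKey.generate()
    k = HKDF(hashes.SHA256(), 32, None, b"toy-kem").derive(eph.exchange(pk))
    return eph.public_key(), k            # (ciphertext, key)

def kem_decap(sk, ct):
    return HKDF(hashes.SHA256(), 32, None, b"toy-kem").derive(sk.exchange(ct))

# Long-term keys of a three-party group.
members = {name: X25519PrivateKey.generate() for name in ("A", "B", "C")}
pubs = {n: sk.public_key() for n, sk in members.items()}

# Single broadcast round: initiator A picks the session secret and sends one
# encapsulation per other member (the secret is XOR-masked under each KEM key).
secret = os.urandom(32)
broadcast = {}
for n in ("B", "C"):
    ct, k = kem_encap(pubs[n])
    broadcast[n] = (ct, bytes(a ^ b for a, b in zip(secret, k)))

# Each recipient decapsulates and unmasks the same session secret.
for n in ("B", "C"):
    ct, masked = broadcast[n]
    k = kem_decap(members[n], ct)
    assert bytes(a ^ b for a, b in zip(masked, k)) == secret
```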

Relevance:

10.00%

Publisher:

Abstract:

Value Management (VM) has been proven to provide a structured framework, together with other supporting tools and techniques, that facilitates effective decision-making in many types of projects, thus achieving ‘best value’ for clients. One of the major success factors of VM in achieving better project objectives for clients is the provision of beneficial input by multi-disciplinary team members involved in critical decision-making discussions during the early stages of construction projects. This paper describes a doctoral research proposal based on the application of VM in design and build construction projects, focusing especially on the design stage. The research aims to study the effects of implementing VM in design and build construction projects, in particular how well the methodology addresses issues related to cost overruns resulting from poor coordination and the overlooking of critical constructability issues amongst team members in Malaysian construction projects. It is proposed that through contractors’ early involvement during the design stage, combined with the use of the VM methodology, particularly as a decision-making tool, better optimization of construction cost can be achieved, thus promoting more efficient and effective constructability. The main methods used in this research are a thorough literature study, semi-structured interviews and a survey of major stakeholders, a detailed case study, and a VM workshop with focus group discussions involving construction professionals, in order to explore and possibly develop a framework and a specific methodology for facilitating the successful application of VM within design and build construction projects.

Relevance:

10.00%

Publisher:

Abstract:

Value Management (VM) has been proven to provide a structured framework, together with supporting tools and techniques, that facilitates effective decision-making in many types of projects, thus achieving ‘best value’ for clients. It is identified at the international level as a natural career progression for construction service providers and as an opportunity to develop leading-edge skills. The services offered by contractors and consultants in the construction sector have been expanding. In an increasingly competitive and global marketplace, firms are seeking ways to differentiate their services to ever more knowledgeable and demanding clients. The traditional demarcations have given way, and the old definition of what contractors, designers, engineers and quantity surveyors can and cannot do in terms of their market offering has changed. Project management, design, and cost and safety consultancy services are being delivered by a diverse range of suppliers. Value management services have been developing in various sectors of industry, from manufacturing to the military and now construction. Given the growing evidence that VM has been successful in delivering value for money to the client, VM would appear to be gaining some momentum as an essential management tool in the Malaysian construction sector. The VM Circular 3/2009, recently issued by the Economic Planning Unit Malaysia (EPU), possibly marks a new beginning in public-sector client acceptance of the strength of VM in construction. This paper therefore attempts to study the prospects for construction service providers of marketing the benefits of VM, and how doing so may provide an edge in an increasingly competitive Malaysian construction industry.

Relevance:

10.00%

Publisher:

Abstract:

With increasing pressure to deliver environmentally friendly and socially responsible highway infrastructure projects, stakeholders are also putting significant focus on the early identification of financial viability and outcomes for these projects. Infrastructure development typically requires major capital input, which may cause serious financial constraints for investors. The push for sustainability has added new dimensions to the evaluation of highway projects, particularly on the cost front. Comprehensive analysis of the cost implications of implementing sustainable measures in highway infrastructure throughout its lifespan is highly desirable; it will become an essential part of the highway development process and a primary concern for decision makers. This paper discusses ongoing research that seeks to identify cost elements and issues related to sustainable measures for highway infrastructure projects. Through life-cycle cost analysis (LCCA), the financial implications of pursuing sustainability, which are of great concern to construction stakeholders, have been assessed to aid decision-making when contemplating the design, development and operation of highway infrastructure. An extensive literature review and an evaluation of project reports from previous Australian highway projects were first conducted to reveal all potential cost elements. This provided the foundation for a questionnaire survey, which helped identify the specific issues and related costs that project stakeholders consider most critical in the Australian industry context. Through the survey, three key stakeholder groups in highway infrastructure development, namely consultants, contractors and government agencies, provided their views on the specific selection and priority ranking of the various categories. Findings of the survey are being integrated into proven LCCA models for further enhancement. A new LCCA model will be developed to assist stakeholders to evaluate costs and investment decisions and to reach an optimum balance between financial viability and sustainability deliverables.
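As a minimal illustration of the life-cycle costing arithmetic that underpins such an analysis (not the model being developed in this research), the sketch below compares the present value of a conventional and a 'sustainable' design option; the discount rate, horizon and all cost figures are assumed.

```python
# Minimal life-cycle cost sketch (illustrative only; rate, horizon and cost
# figures are assumed, not taken from the study).
def life_cycle_cost(initial, annual_om, rehab, rehab_years, horizon=30, rate=0.04):
    """Present value of life-cycle cost: capital + O&M + periodic rehabilitation."""
    pv = initial
    for t in range(1, horizon + 1):
        cost = annual_om + (rehab if t in rehab_years else 0.0)
        pv += cost / (1.0 + rate) ** t
    return pv

conventional = life_cycle_cost(initial=10.0e6, annual_om=0.20e6, rehab=1.5e6, rehab_years={10, 20})
sustainable  = life_cycle_cost(initial=11.5e6, annual_om=0.12e6, rehab=1.0e6, rehab_years={15})
print(f"conventional: ${conventional/1e6:.1f}M, sustainable: ${sustainable/1e6:.1f}M")
```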

Relevance:

10.00%

Publisher:

Abstract:

The Trouble with Play is a radical departure from some of the ideas about play that are held dear by many in early childhood education. For many, play is considered essential to children's development and learning, and is often promoted as a universal and almost magical 'fix'. Although play does have many proven benefits for children, the authors show that play in the early years is not always innocent, fun and natural. Play can also be political and involve morals, ethics, values and power.

Relevance:

10.00%

Publisher:

Abstract:

Now in its sixth edition, the Traffic Engineering Handbook continues to be a must-have publication in the transportation industry, as it has been for the past 60 years. The new edition provides updated information for people entering the practice and for those already practicing. The handbook is a convenient desk reference, as well as an all-in-one source of principles and proven techniques in traffic engineering. Most chapters are presented in a new format, which divides the chapters into four areas: basics, current practice, emerging trends and information sources. Chapter topics include road users, vehicle characteristics, statistics, planning for operations, communications, safety, regulations, traffic calming, access management, geometrics, signs and markings, signals, parking, traffic demand, maintenance and studies. In addition, as the focus in transportation has shifted from project-based to operations-based, two new chapters have been added: "Planning for Operations" and "Managing Traffic Demand to Address Congestion: Providing Travelers with Choices." The Traffic Engineering Handbook continues to be one of the primary reference sources for study to become a certified Professional Traffic Operations Engineer™. Chapters are authored by notable and experienced authors, and reviewed and edited by a distinguished panel of traffic engineering experts.

Relevance:

10.00%

Publisher:

Abstract:

High reliability of railway power systems is one of the essential criteria for ensuring the quality and cost-effectiveness of railway services. Evaluation of reliability at the system level is essential not only for scheduling maintenance activities, but also for identifying reliability-critical components. Various methods for computing the reliability of individual components or regularly structured systems have been developed and proven to be effective. However, they are not adequate for evaluating complicated systems with numerous interconnected components, such as railway power systems, or for locating the reliability-critical components. Fault tree analysis (FTA) integrates the reliability of individual components into the overall system reliability through quantitative evaluation and identifies the critical components by means of minimum cut sets and sensitivity analysis. The paper presents the reliability evaluation of railway power systems by FTA and investigates the impact of maintenance activities on overall reliability. The applicability of the proposed methods is illustrated by case studies in AC railways.
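As a minimal illustration of the cut-set arithmetic behind FTA (not the paper's railway model), the sketch below approximates a top-event probability from invented minimum cut sets and ranks components by a crude sensitivity measure, in the spirit of the sensitivity analysis mentioned above.

```python
# Minimal fault-tree sketch (illustrative; component failure probabilities and
# cut sets are invented, not from the paper). The top-event probability is
# approximated from minimum cut sets via P(top) ~= 1 - prod(1 - P(cut set)).
from math import prod

# Hypothetical basic-event failure probabilities for a feeder section.
p = {"transformer": 0.002, "breaker": 0.004, "catenary": 0.010, "backup_feed": 0.050}

# Minimum cut sets: the top event occurs if every event in any one set fails.
cut_sets = [{"catenary"}, {"transformer", "backup_feed"}, {"breaker", "backup_feed"}]

def top_probability(p, cut_sets):
    return 1.0 - prod(1.0 - prod(p[e] for e in cs) for cs in cut_sets)

base = top_probability(p, cut_sets)
print(f"top event probability: {base:.5f}")

# Crude sensitivity: change in top-event probability when each component is
# made perfectly reliable, used here as a proxy for criticality ranking.
for comp in p:
    perfect = dict(p, **{comp: 0.0})
    print(comp, f"{base - top_probability(perfect, cut_sets):.5f}")
```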