Abstract:
Background Evolutionary biologists are often misled by convergence of morphology, and this has been common in the study of bird evolution. However, the use of molecular data sets has its own problems, and phylogenies based on short DNA sequences have the potential to mislead us too. The relationships among clades and the timing of the evolution of modern birds (Neoaves) have not yet been well resolved, and evidence of convergence of morphology remains controversial. With six new bird mitochondrial genomes (hummingbird, swift, kagu, rail, flamingo and grebe) we test the proposed Metaves/Coronaves division within Neoaves and the parallel radiations in this primary avian clade. Results Our mitochondrial trees did not return the Metaves clade that had been proposed based on one nuclear intron sequence. We suggest that the high number of indels within the seventh intron of the β-fibrinogen gene at this phylogenetic level, which left a dataset with not a single site across the alignment shared by all taxa, resulted in artifacts during analysis. With respect to the overall avian tree, we find that the flamingo and grebe are sister taxa and basal to the shorebirds (Charadriiformes). Using a novel site-stripping technique for noise reduction, we found this relationship to be stable. The hummingbird/swift clade falls outside the large and very diverse group of raptors, shore and sea birds. Unexpectedly, the kagu is not closely related to the rail in our analysis, but because neither the kagu nor the rail has close affinity to any taxa within this dataset of 41 birds, their placement is not yet resolved. Conclusion Our phylogenetic hypothesis based on 41 avian mitochondrial genomes (13,229 bp) rejects the monophyly of the seven Metaves species, and we therefore conclude that the members of Metaves do not share a common evolutionary history within the Neoaves.
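The abstract does not spell out the site-stripping procedure, but the general noise-reduction idea can be sketched: score each alignment column by its variability and drop the noisiest sites before tree-building. The entropy threshold and the toy alignment below are illustrative assumptions, not data from the paper.

```python
from math import log2

def site_entropy(column):
    """Shannon entropy of one alignment column (ignoring gaps)."""
    residues = [c for c in column if c != "-"]
    n = len(residues)
    if n == 0:
        return 0.0
    freqs = {r: residues.count(r) / n for r in set(residues)}
    return -sum(p * log2(p) for p in freqs.values())

def strip_noisy_sites(alignment, max_entropy=1.0):
    """Keep only the columns whose entropy is at or below the threshold."""
    columns = list(zip(*alignment))
    kept = [i for i, col in enumerate(columns) if site_entropy(col) <= max_entropy]
    return ["".join(seq[i] for i in kept) for seq in alignment]

# Toy 4-taxon alignment: the two highly variable sites are stripped.
alignment = ["ACGTAC", "ACGTTC", "ACGAGC", "ACGCCC"]
stripped = strip_noisy_sites(alignment, max_entropy=1.0)
```

In a real analysis the threshold would be varied and the tree re-estimated at each level to check whether a grouping remains stable as noisy sites are removed.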
Abstract:
A traditional approach centred on weekly lectures, perhaps supported by a tutorial programme, still predominates in modern legal education in Australia. This approach tends to focus on the transmission of knowledge about legal rules and doctrine to students, who adopt a largely passive role. Criticisms of the traditional approach have led law schools to expand their curricula to include the teaching of skills, including the skill of negotiation and an appreciation of legal ethics and professional responsibility. However, in a climate of limited government funding for law schools in Australia, innovation in legal education remains a challenge. This paper considers the successful use of Second Life machinima in two programs, Air Gondwana and Entry into Valhalla, and their part in the creation of engaging, effective learning environments. These programs not only engage students in active learning but also give them flexibility in their studies, among other benefits. The programs yield important lessons concerning the use of machinima innovations in curricula, not only for academics involved in legal education but also for those in other disciplines, especially disciplines that rely on traditional passive lectures in their teaching and learning approaches.
Abstract:
Additive manufacturing techniques offer the potential to fabricate organized tissue constructs to repair or replace damaged or diseased human tissues and organs. Using these techniques, spatial variations of cells along multiple axes with high geometric complexity in combination with different biomaterials can be generated. The level of control offered by these computer-controlled technologies to design and fabricate tissues will accelerate our understanding of the governing factors of tissue formation and function. Moreover, it will provide a valuable tool to study the effect of anatomy on graft performance. In this review, we discuss the rationale for engineering tissues and organs by combining computer-aided design with additive manufacturing technologies that encompass the simultaneous deposition of cells and materials. Current strategies are presented, particularly with respect to limitations due to the lack of suitable polymers, and requirements to move the current concepts to practical application.
Abstract:
During nutrition intervention programs, some form of dietary assessment is usually necessary. This dietary assessment can be for initial screening, development of appropriate programs and activities, or evaluation. Established methods of dietary assessment are not always practical or cost-effective in such interventions, so an abbreviated dietary assessment tool is needed. The Queensland Nutrition Project developed such a tool for male blue-collar workers, the Food Behaviour Questionnaire, consisting of 27 food behaviour related questions. The tool was validated in a sample of 23 men against full dietary assessment obtained via food frequency questionnaires and 24-hour dietary recalls. Questions that correlated poorly with the full dietary assessment were deleted from the tool. In all, only 13 questions were required to distinguish between high and low dietary intakes of particular nutrients. Three questions when combined had correlations with refined sugar between 0.617 and 0.730 (p<0.005); four questions when combined had a correlation with dietary fibre as a percentage of energy of 0.45 (p<0.05); five questions when combined had a correlation with total fat of 0.499 (p<0.05); and four questions when combined had correlations with saturated fat between 0.451 and 0.589 (p<0.05). No significant correlation could be found between the food behaviour questions and dietary sodium, and correlations for fat as a proportion of energy could not be found.
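The validation step described above amounts to correlating combined question scores against intakes from the full dietary assessment. A minimal sketch with hypothetical subject data (the scores and intakes below are invented for illustration, not the Queensland Nutrition Project data):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: each subject's combined score on three sugar-related
# questions, and refined-sugar intake (g/day) from a full assessment.
combined_scores = [3, 5, 2, 8, 6, 4, 7, 1]
sugar_intake = [30, 55, 25, 80, 58, 42, 70, 15]

r = pearson_r(combined_scores, sugar_intake)
```

A combination of questions would be retained only if r (with an accompanying p-value from a significance test, omitted here) reached a level comparable to those reported above.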
Abstract:
Nowadays, Workflow Management Systems (WfMSs) and, more generally, Process Management Systems (PMSs), which are process-aware Information Systems (PAISs), are widely used to support many human organizational activities, ranging from well-understood, relatively stable and structured processes (supply chain management, postal delivery tracking, etc.) to processes that are more complicated, less structured and may exhibit a high degree of variation (health-care, emergency management, etc.). Every aspect of a business process involves a certain amount of knowledge, which may be complex depending on the domain of interest. The adequate representation of this knowledge is determined by the modeling language used. Some processes behave in a way that is well understood, predictable and repeatable: the tasks are clearly delineated and the control flow is straightforward. Recent discussions, however, illustrate the increasing demand for solutions for knowledge-intensive processes, where these characteristics are less applicable. The actors involved in the conduct of a knowledge-intensive process have to deal with a high degree of uncertainty. Tasks may be hard to perform and the order in which they need to be performed may be highly variable. Modeling knowledge-intensive processes can be complex, as it may be hard to capture at design-time what knowledge is available at run-time. In realistic environments, for example, actors lack important knowledge at execution time or this knowledge can become obsolete as the process progresses. Even if each actor (at some point) has perfect knowledge of the world, it may not be certain of its beliefs at later points in time, since tasks by other actors may change the world without those changes being perceived. Typically, a knowledge-intensive process cannot be adequately modeled by classical, state-of-the-art process/workflow modeling approaches.
In some respects there is a lack of maturity, both in capturing the semantic aspects involved and in reasoning about them. The main focus of the 1st International Workshop on Knowledge-intensive Business Processes (KiBP 2012) was investigating how techniques from different fields, such as Artificial Intelligence (AI), Knowledge Representation (KR), Business Process Management (BPM), Service Oriented Computing (SOC), etc., can be combined with the aim of improving the modeling and enactment phases of a knowledge-intensive process. The workshop was held as part of the program of the 2012 Knowledge Representation & Reasoning International Conference (KR 2012) in Rome, Italy, in June 2012. It was hosted by the Dipartimento di Ingegneria Informatica, Automatica e Gestionale Antonio Ruberti of Sapienza Università di Roma, with financial support from the University, through grant 2010-C26A107CN9 TESTMED, and from the EU Commission through the projects FP7-25888 Greener Buildings and FP7-257899 Smart Vortex. This volume contains the 5 papers accepted and presented at the workshop. Each paper was reviewed by three members of the internationally renowned Program Committee. In addition, a further paper was invited for inclusion in the workshop proceedings and for presentation at the workshop. Two keynote talks completed the scientific program: one by Marlon Dumas (Institute of Computer Science, University of Tartu, Estonia) on "Integrated Data and Process Management: Finally?" and the other by Yves Lesperance (Department of Computer Science and Engineering, York University, Canada) on "A Logic-Based Approach to Business Processes Customization".
We would like to thank all the Program Committee members for the valuable work in selecting the papers, Andrea Marrella for his valuable work as publication and publicity chair of the workshop, and Carola Aiello and the consulting agency Consulta Umbria for the organization of this successful event.
Abstract:
A holistic study of the composition of the basalt groundwaters of the Atherton Tablelands region in Queensland, Australia was undertaken to elucidate possible mechanisms for the evolution of these very low salinity, silica- and bicarbonate-rich groundwaters. It is proposed that aluminosilicate mineral weathering is the major contributing process to the overall composition of the basalt groundwaters. The groundwaters approach equilibrium with respect to the primary minerals with increasing pH and are mostly in equilibrium with the major secondary minerals (kaolinite and smectite), and other secondary phases such as goethite, hematite, and gibbsite, which are common accessory minerals in the Atherton basalts. The mineralogy of the basalt rocks, which has been examined using X-ray diffraction and whole rock geochemistry methods, supports the proposed model for the hydrogeochemical evolution of these groundwaters: precipitation + CO₂ (atmospheric + soil) + pyroxene + feldspars + olivine yields H₄SiO₄, HCO₃⁻, Mg²⁺, Na⁺, Ca²⁺ + kaolinite and smectite clays + amorphous or crystalline silica + accessory minerals (hematite, goethite, gibbsite, carbonates, zeolites, and pyrite). The variations in the mineralogical content of these basalts also provide insights into the controls on groundwater storage and movement in this aquifer system. The fresh and weathered vesicular basalts are considered to be important in terms of zones of groundwater occurrence, while the fractures in the massive basalt are important pathways for groundwater movement.
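Equilibrium statements like those above are conventionally quantified with a saturation index, SI = log₁₀(IAP/Ksp), where SI ≈ 0 indicates equilibrium with the mineral, SI < 0 undersaturation, and SI > 0 oversaturation. A minimal sketch with illustrative (not measured) activity products:

```python
from math import log10

def saturation_index(ion_activity_product, ksp):
    """SI = log10(IAP / Ksp): ~0 at equilibrium, <0 undersaturated,
    >0 oversaturated with respect to the mineral."""
    return log10(ion_activity_product / ksp)

# Illustrative (hypothetical) ion activity products for two minerals:
si_calcite = saturation_index(ion_activity_product=10 ** -8.6, ksp=10 ** -8.48)
si_gibbsite = saturation_index(ion_activity_product=10 ** 8.2, ksp=10 ** 8.11)
```

In practice, activities are computed from measured concentrations with a speciation code and an SI is evaluated for each candidate primary and secondary mineral.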
Abstract:
Appearance-based localization can provide loop closure detection at vast scales regardless of accumulated metric error. However, the computation time and memory requirements of current appearance-based methods scale not only with the size of the environment but also with the operation time of the platform. Additionally, repeated visits to locations will develop multiple competing representations, which will reduce recall performance over time. These properties impose severe restrictions on long-term autonomy for mobile robots, as loop closure performance will inevitably degrade with increased operation time. In this paper we present a graphical extension to CAT-SLAM, a particle filter-based algorithm for appearance-based localization and mapping, to provide constant computation and memory requirements over time and minimal degradation of recall performance during repeated visits to locations. We demonstrate loop closure detection in a large urban environment with capped computation time and memory requirements and performance exceeding previous appearance-based methods by a factor of 2. We discuss the limitations of the algorithm with respect to environment size, appearance change over time and applications in topological planning and navigation for long-term robot operation.
Abstract:
A tax amendment bill is due to be debated, and the Government has promised to outline its plan for the reform of the taxation system sometime this year. The plans appear to go beyond the mere introduction of some sort of goods and services tax to reform of the whole taxation system, including fiscal relations with the States. Not-for-profit organisations will find their taxation environment will change. Governments are reluctant to permit exemptions to GST-style arrangements. GST trade-offs such as reduced income tax rates and abolishing indirect taxes are useless to nonprofit organisations, as many are already exempt from such imposts. Administrative changes to tax collections may also have an impact: if the government decides to make an individual PAYE taxpayer return optional in exchange for no or standard deductions, this may have an effect on fundraising. The FBT and salary packaging schemes that not-for-profit organisations use will be under intense scrutiny. A regionalisation of the ATO along the successful model of the ASC would see discrete areas such as not-for-profit exemptions being centralised in one regional office for the whole of Australia. For example, the Tasmanian ASC Office is responsible for much of the work in respect of corporate charities and not-for-profit companies.
Abstract:
The development of the Learning and Teaching Academic Standards Statement for Architecture (the Statement) centred on requirements for the Master of Architecture and proceeded alongside similar developments in the building and construction discipline under the guidance and support of the Australian Deans of Built Environment and Design (ADBED). Through their representation of Australian architecture programs, ADBED have provided high-level leadership for the Learning and Teaching Academic Standards Project in Architecture (LTAS Architecture). The threshold learning outcomes (TLOs), the description of the nature and extent of the discipline, and accompanying notes were developed through wide consultation with the discipline and profession nationally. They have been considered and debated by ADBED on a number of occasions and have, in their final form, been strongly endorsed by the Deans. ADBED formed the core of the Architecture Reference Group (chaired by an ADBED member) that drew together representatives of every peak organisation for the profession and discipline in Australia. The views of the architectural education community and profession have been provided both through individual submissions and the voices of a number of peak bodies. Over two hundred individuals from the practising profession, the academic workforce and the student cohort have worked together to build consensus about the capabilities expected of a graduate of an Australian Master of Architecture degree. It was critical from the outset that the Statement should embrace the wisdom of the greater ‘tribe’, should ensure that graduates of the Australian Master of Architecture were eligible for professional registration and, at the same time, should allow for scope and diversity in the shape of Australian architectural education. A consultation strategy adopted by the Discipline Scholar involved meetings and workshops in Perth, Melbourne, Sydney, Canberra and Brisbane.
Stakeholders from all jurisdictions and most universities participated in the early phases of consultation through a series of workshops that concluded late in October 2010. The Draft Architecture Standards Statement was formed from these early meetings and consultation in respect of that document continued through early 2011. This publication represents the outcomes of work to establish an agreed standards statement for the Master of Architecture. Significant further work remains to ensure the alignment of professional accreditation and recognition procedures with emerging regulatory frameworks cascading from the establishment of the Tertiary Education Quality and Standards Agency (TEQSA). The Australian architecture community hopes that mechanisms can be found to integrate TEQSA’s quality assurance purpose with well-established and understood systems of professional accreditation to ensure the good standing of Australian architectural education into the future. The work to build renewed and integrated quality assurance processes and to foster the interests of this project will continue, for at least the next eighteen months, under the auspices of Australian Learning and Teaching Council (ALTC)-funded Architecture Discipline Network (ADN), led by ADBED and Queensland University of Technology. The Discipline Scholar gratefully acknowledges the generous contributions given by those in stakeholder communities to the formulation of the Statement. Professional and academic colleagues have travelled and gathered to shape the Standards Statement. Debate has been vigorous and spirited and the Statement is rich with the purpose, critical thinking and good judgement of the Australian architectural education community. The commitments made to the processes that have produced this Statement reflect a deep and abiding interest by the constituency in architectural education. 
This commitment bodes well for the vibrancy and productivity of the emergent Architecture Discipline Network (ADN). Endorsement, in writing, was received from the Australian Institute of Architects National Education Committee (AIA NEC): The National Education Committee (NEC) of the Australian Institute of Architects thank you for your work thus far in developing the Learning and Teaching Academic Standards for Architecture. In particular, we acknowledge your close consultation with the NEC on the project, along with a comprehensive cross-section of the professional and academic communities in architecture. The TLOs with the nuanced levels of capacities – to identify, develop, explain, demonstrate etc. – are described at an appropriate level to be understood as minimum expectations for a Master of Architecture graduate. The Architects Accreditation Council of Australia (AACA) has noted: There is a clear correlation between the current processes for accreditation and what may be the procedures in the future following the current review. The requirement of the outcomes as outlined in the draft paper to demonstrate capability is an appropriate way of expressing the measure of whether the learning outcomes have been achieved. The measure of capability as described in the outcome statements is enhanced with explanatory descriptions in the accompanying notes.
Abstract:
This paper describes a novel method for determining the extrinsic calibration parameters between 2D and 3D LIDAR sensors with respect to a vehicle base frame. To recover the calibration parameters we attempt to optimize the quality of a 3D point cloud produced by the vehicle as it traverses an unknown, unmodified environment. The point cloud quality metric is derived from Rényi Quadratic Entropy and quantifies the compactness of the point distribution using only a single tuning parameter. We also present a fast approximate method to reduce the computational requirements of the entropy evaluation, allowing unsupervised calibration in vast environments with millions of points. The algorithm is analyzed using real world data gathered in many locations, showing robust calibration performance and substantial speed improvements from the approximations.
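The paper's exact derivation is not reproduced here, but a Rényi quadratic entropy of a point cloud under a Gaussian kernel can be sketched as follows; the kernel width and the test clouds are illustrative assumptions. Lower entropy indicates a more compact ("crisper") cloud, which is what the calibration search rewards.

```python
import numpy as np

def renyi_quadratic_entropy(points, sigma=0.1):
    """Rényi quadratic entropy of a point set under a Gaussian kernel.
    Lower values indicate a more compact ('crisper') point cloud."""
    _, d = points.shape
    diff = points[:, None, :] - points[None, :, :]   # pairwise differences
    sq_dists = np.sum(diff ** 2, axis=-1)            # pairwise squared distances
    var = 2.0 * sigma ** 2                           # kernel convolved with itself
    kernel = np.exp(-sq_dists / (2.0 * var)) / ((2.0 * np.pi * var) ** (d / 2.0))
    return -np.log(kernel.mean())

# A tight cloud should score lower (better) than a diffuse one.
rng = np.random.default_rng(0)
e_compact = renyi_quadratic_entropy(rng.normal(scale=0.05, size=(200, 3)))
e_diffuse = renyi_quadratic_entropy(rng.normal(scale=0.50, size=(200, 3)))
```

The direct double sum is O(N²); the fast approximation mentioned in the paper would replace it with a cheaper estimate so millions of points become tractable.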
Abstract:
An increasing number of researchers have hypothesized that ozone may be involved in the particle formation processes that occur during printing; however, no studies have investigated this further. In the current study, this hypothesis was tested in a chamber study by adding supplemental ozone to the chamber after a print job without measurable ozone emissions. Subsequent particle number concentration and size distribution measurements showed that new particles were formed minutes after the addition of ozone. The results demonstrated that ozone did react with printer-generated volatile organic compounds (VOCs) to form secondary organic aerosols (SOAs). The hypothesis was further confirmed by the observation of correlations among VOC, ozone, and particle concentrations during a print job with measurable ozone emissions. The potential particle precursors were identified by a number of furnace tests, which suggested that squalene and styrene were the most likely SOA precursors with respect to ozone. Overall, this study significantly improved scientific understanding of the formation mechanisms of printer-generated particles, and highlighted the possible SOA formation potential of unsaturated non-terpene organic compounds by ozone-initiated reactions in the indoor environment. © 2011 American Chemical Society.
Abstract:
For many years, computer vision has lured researchers with promises of a low-cost, passive, lightweight and information-rich sensor suitable for navigation purposes. The prime difficulty in vision-based navigation is that the navigation solution will continually drift with time unless external information is available, whether it be cues from the appearance of the scene, a map of features (whether built online or known a priori), or an externally-referenced sensor. It is not merely position that is of interest in the navigation problem. Attitude (i.e. the angular orientation of a body with respect to a reference frame) is integral to a vision-based navigation solution and is often of interest in its own right (e.g. flight control). This thesis examines vision-based attitude estimation in an aerospace environment, and two methods are proposed for constraining drift in the attitude solution: one through a novel integration of optical flow and the detection of the sky horizon, and the other through a loosely-coupled integration of Visual Odometry and GPS position measurements. In the first method, roll angle, pitch angle and the three aircraft body rates are recovered through a novel method of tracking the horizon over time and integrating the horizon-derived attitude information with optical flow. An image processing front-end is used to select several candidate lines in an image that may or may not correspond to the true horizon, and the optical flow is calculated for each candidate line. Using an Extended Kalman Filter (EKF), the previously estimated aircraft state is propagated using a motion model, and a candidate horizon line is associated using a statistical test based on the optical flow measurements and the location of the horizon in the image. Once associated, the selected horizon line, along with the associated optical flow, is used as a measurement to the EKF.
To evaluate the accuracy of the algorithm, two flights were conducted, one using a highly dynamic Uninhabited Airborne Vehicle (UAV) in clear flight conditions and the other in a human-piloted Cessna 172 in conditions where the horizon was partially obscured by terrain, haze and smoke. The UAV flight resulted in pitch and roll error standard deviations of 0.42° and 0.71° respectively when compared with a truth attitude source. The Cessna 172 flight resulted in pitch and roll error standard deviations of 1.79° and 1.75° respectively. In the second method for estimating attitude, a novel integrated GPS/Visual Odometry (GPS/VO) navigation filter is proposed, using a structure similar to a classic loosely-coupled GPS/INS error-state navigation filter. Under such an arrangement, the error dynamics of the system are derived and a Kalman Filter is developed for estimating the errors in position and attitude. Through similar analysis to the GPS/INS problem, it is shown that the proposed filter is capable of recovering the complete attitude (i.e. pitch, roll and yaw) of the platform when subjected to acceleration not parallel to velocity, for both the monocular and stereo variants of the filter. Furthermore, it is shown that under general straight-line motion (e.g. constant velocity), only the component of attitude in the direction of motion is unobservable. Numerical simulations are performed to demonstrate the observability properties of the GPS/VO filter in both the monocular and stereo camera configurations. Furthermore, the proposed filter is tested on imagery collected using a Cessna 172 to demonstrate the observability properties on real-world data. The proposed GPS/VO filter does not require additional restrictions or assumptions such as platform-specific dynamics, map-matching, feature-tracking, visual loop-closing, gravity vector or additional sensors such as an IMU or magnetic compass. Since no platform-specific dynamics are required, the proposed filter is not limited to the aerospace domain and has the potential to be deployed in other platforms such as ground robots or mobile phones.
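The thesis's horizon-association EKF and GPS/VO error-state filter are more involved than can be shown here; as an illustration of the underlying fusion step only, the sketch below runs a stripped-down linear Kalman filter that propagates [roll, pitch] with measured body rates and corrects with a noisy horizon-derived attitude measurement. All noise values and data are assumptions for illustration, not parameters from the thesis.

```python
import numpy as np

def kf_predict(x, P, rates, dt, Q):
    """Propagate [roll, pitch] by integrating measured body rates."""
    x = x + rates * dt
    P = P + Q
    return x, P

def kf_update(x, P, z, R):
    """Fuse a horizon-derived [roll, pitch] measurement (H = identity)."""
    S = P + R                       # innovation covariance
    K = P @ np.linalg.inv(S)        # Kalman gain
    x = x + K @ (z - x)
    P = (np.eye(2) - K) @ P
    return x, P

# Toy run: constant roll rate with noisy horizon measurements.
rng = np.random.default_rng(1)
Q = np.eye(2) * 1e-4                # assumed process noise
R = np.eye(2) * 0.05                # assumed measurement noise
x, P = np.zeros(2), np.eye(2)
true = np.zeros(2)
rates = np.array([0.1, 0.0])        # rad/s body rates
for _ in range(100):
    true = true + rates * 0.01
    x, P = kf_predict(x, P, rates, 0.01, Q)
    z = true + rng.normal(scale=0.05, size=2)
    x, P = kf_update(x, P, z, R)
```

The actual EKF additionally carries a nonlinear motion model, candidate-line association via a statistical test, and optical-flow measurements, none of which are modelled in this sketch.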
Abstract:
Purpose: Silicone hydrogel contact lenses (CLs) are becoming increasingly popular for daily wear (DW), extended wear (EW) and continuous wear (CW), due to their higher oxygen transmissibility compared to hydrogel CLs. The aim of this study was to investigate the clinical and subjective performance of asmofilcon A (Menicon Co., Ltd), a new surface treated silicone hydrogel CL, during 6-night EW over 6 months (M). Methods: A prospective, randomised, single-masked, monadic study was conducted. N=60 experienced DW soft CL wearers were randomly assigned to wear either asmofilcon A (test: Dk=129, water content (WC)=40%, Nanogloss surface treatment) or senofilcon A (control: Dk=103, WC=38%, PVP internal wetting agent, Vistakon, Johnson & Johnson Vision Care) CLs bilaterally for 6 M on an EW basis. A PHMB-preserved solution (Menicon Co., Ltd) was dispensed for CL care. Evaluations were conducted at CL delivery and after 1 week (W), 4 W, 3 M and 6 M of EW. At each visit, a range of objective and subjective clinical performance measures were assessed. Results: N=50 subjects (83%) successfully completed the study, with the majority of discontinuations due to loss to follow-up (n=3) or moving away/travel (n=5). N=2 subjects experienced adverse events; n=1 unilateral red eye with asmofilcon A and n=1 asymptomatic infiltrate with senofilcon A. There were no significant differences in high or low contrast distance visual acuity (HCDVA or LCDVA) between asmofilcon A and senofilcon A; however, LCDVA decreased significantly over time with both CL types (p<0.05). The two CL types did not vary significantly with respect to any of the objective and subjective measures assessed (p>0.05); CL fitting characteristics and CL surface measurements were very similar and mean bulbar and limbal redness measures were always less than grade 1.0. Superior palpebral conjunctival injection showed a statistically, but not clinically, significant increase over time with both CL types (p<0.05). 
Corneal staining did not vary significantly between asmofilcon A and senofilcon A (p>0.05), with low median gradings of less than 0.5 observed for all areas assessed. There were no solution-related staining reactions observed with either CL type. The asmofilcon A and senofilcon A CLs were both rated highly with respect to overall comfort, with medians of 14 or 15 hours of comfortable lens wearing time per day reported at each of the study visits (p>0.05). Conclusions: Over 6 months of EW, the asmofilcon A and senofilcon A CLs performed in a similar manner with respect to visual acuity, ocular health and CL performance measures. Some changes over time were observed with both CL types, including reduced LCDVA and increased superior palpebral injection, which warrant further investigation in longer-term EW studies. Asmofilcon A appeared to be equivalent in performance to senofilcon A.
Abstract:
In Australian Meat Holdings Pty Ltd v Sayers [2007] QSC 390, Daubney J considered the obligation imposed on a claimant under s 275 of the Workers’ Compensation and Rehabilitation Act 2003 (Qld) to provide the insurer with an authority to obtain information and documents. The decision has practical implications for claimants and insurers.
Abstract:
We present a formalism for the analysis of sensitivity of nuclear magnetic resonance pulse sequences to variations of pulse sequence parameters, such as radiofrequency pulses, gradient pulses or evolution delays. The formalism enables the calculation of compact, analytic expressions for the derivatives of the density matrix and the observed signal with respect to the parameters varied. The analysis is based on two constructs computed in the course of modified density-matrix simulations: the error interrogation operators and error commutators. The approach presented is consequently named the Error Commutator Formalism (ECF). It is used to evaluate the sensitivity of the density matrix to parameter variation based on the simulations carried out for the ideal parameters, obviating the need for finite-difference calculations of signal errors. The ECF analysis therefore carries a computational cost comparable to a single density-matrix or product-operator simulation. Its application is illustrated using a number of examples from basic NMR spectroscopy. We show that the strength of the ECF is its ability to provide analytic insights into the propagation of errors through pulse sequences and the behaviour of signal errors under phase cycling. Furthermore, the approach is algorithmic and easily amenable to implementation in the form of a programming code. It is envisaged that it could be incorporated into standard NMR product-operator simulation packages.
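The ECF itself is not reproduced here, but the identity it exploits can be illustrated on a toy one-spin example: for a pulse propagator U(θ) = exp(−iθSx), the derivative of the rotated density matrix with respect to the flip angle is a commutator, dρ/dθ = −i[Sx, ρ], so the signal derivative can be computed analytically and checked against a finite difference. The operators and flip angle below are illustrative choices, not a pulse sequence from the paper.

```python
import numpy as np

# Spin-1/2 operators and a simple detection operator
sx = np.array([[0, 1], [1, 0]], dtype=complex) / 2
sz = np.array([[1, 0], [0, -1]], dtype=complex) / 2
sminus = np.array([[0, 0], [1, 0]], dtype=complex)

def pulse(theta):
    """Propagator of an x-pulse of flip angle theta: exp(-i*theta*Sx)."""
    return np.cos(theta / 2) * np.eye(2) - 2j * np.sin(theta / 2) * sx

def signal(theta, rho0=sz):
    """Observed signal Tr(S- rho) after the pulse."""
    U = pulse(theta)
    rho = U @ rho0 @ U.conj().T
    return np.trace(sminus @ rho)

theta = 0.7
# Analytic derivative via the commutator: d(rho)/d(theta) = -i [Sx, rho]
U = pulse(theta)
rho = U @ sz @ U.conj().T
drho = -1j * (sx @ rho - rho @ sx)
analytic = np.trace(sminus @ drho)
# Finite-difference check, which the analytic route makes unnecessary
eps = 1e-6
numeric = (signal(theta + eps) - signal(theta - eps)) / (2 * eps)
```

The ECF generalises this idea to whole pulse sequences, organising such commutators so that error propagation and phase-cycling behaviour can be read off from a single simulation at the ideal parameters.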