96 results for "Resolution of problems"
Abstract:
Carrier phase ambiguity resolution over long baselines is challenging in BDS data processing. This is partially due to the variations of the hardware biases in BDS code signals and their dependence on elevation angle. We present an assessment of satellite-induced code bias variations in BDS triple-frequency signals and ambiguity resolution procedures involving both geometry-free and geometry-based models. First, since the elevation of a GEO satellite remains unchanged, we propose to model the single-differenced fractional cycle bias with widespread ground stations. Second, the effects of code bias variations induced by GEO, IGSO and MEO satellites on ambiguity resolution of extra-wide-lane, wide-lane and narrow-lane combinations are analyzed. Third, together with the IGSO and MEO code bias variation models, the effects of code bias variations on ambiguity resolution are examined using 30 days of data collected over baselines ranging from 500 to 2600 km in 2014. The results suggest that although the effect of code bias variations on the extra-wide-lane integer solution is almost negligible due to its long wavelength, the wide-lane integer solutions are rather sensitive to code bias variations. Wide-lane ambiguity resolution success rates are evidently improved when code bias variations are corrected. However, the improvement for narrow-lane ambiguity resolution is not obvious, since it is based on the geometry-based model and the code bias variations have only an indirect impact on the narrow-lane ambiguity solutions.
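The contrast between extra-wide-lane and wide-lane sensitivity follows directly from the combination wavelengths. As a hedged illustration (the nominal BDS-2 carrier frequencies below are standard constants, not values taken from this paper), the wavelength of a dual-frequency (1, −1) phase combination can be computed as:

```python
# Illustration only: combination wavelengths from nominal BDS-2 carrier
# frequencies; this is not the paper's processing chain.
C = 299_792_458.0   # speed of light, m/s
F_B1 = 1561.098e6   # B1 carrier frequency, Hz
F_B2 = 1207.140e6   # B2 carrier frequency, Hz
F_B3 = 1268.520e6   # B3 carrier frequency, Hz

def combo_wavelength(f_a, f_b):
    """Wavelength of the (1, -1) dual-frequency phase combination."""
    return C / (f_a - f_b)

wide_lane = combo_wavelength(F_B1, F_B2)        # ~0.85 m
extra_wide_lane = combo_wavelength(F_B3, F_B2)  # ~4.9 m
```

The extra-wide-lane wavelength of roughly 4.9 m is why its integer solution barely notices decimetre-level code bias variations, while the ~0.85 m wide-lane combination is far more sensitive.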
Abstract:
Diffusion equations that use time fractional derivatives are attractive because they describe a wealth of problems involving non-Markovian random walks. The time fractional diffusion equation (TFDE) is obtained from the standard diffusion equation by replacing the first-order time derivative with a fractional derivative of order α ∈ (0, 1). Developing numerical methods for solving fractional partial differential equations is a new research field, and the theoretical analysis of the numerical methods associated with them is not fully developed. In this paper an explicit conservative difference approximation (ECDA) for the TFDE is proposed. We give a detailed analysis for this ECDA and generate discrete models of random walk suitable for simulating random variables whose spatial probability density evolves in time according to this fractional diffusion equation. The stability and convergence of the ECDA for the TFDE in a bounded domain are discussed. Finally, some numerical examples are presented to show the application of the present technique.
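To make the construction concrete, here is a minimal sketch of an explicit scheme of this general type for D_t^α u = K u_xx, using the standard L1 approximation of the Caputo derivative. It illustrates the approach rather than reproducing the paper's specific ECDA; the function name and parameter values are illustrative.

```python
import math
import numpy as np

def tfde_explicit(u0, alpha, K, h, tau, n_steps):
    """Explicit L1-type scheme for the time-fractional diffusion equation
    D_t^alpha u = K u_xx  (0 < alpha < 1), zero Dirichlet boundaries.
    Returns the solution history, one row per time level."""
    mu = math.gamma(2.0 - alpha) * tau**alpha * K / h**2
    b = lambda j: (j + 1) ** (1.0 - alpha) - j ** (1.0 - alpha)  # L1 weights
    hist = [np.asarray(u0, dtype=float)]
    for n in range(n_steps):
        u = hist[n]
        lap = np.zeros_like(u)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]  # discrete Laplacian
        # memory term: sum_{j=1}^{n} b_j (u^{n+1-j} - u^{n-j})
        mem = np.zeros_like(u)
        for j in range(1, n + 1):
            mem += b(j) * (hist[n + 1 - j] - hist[n - j])
        new = u + mu * lap - mem
        new[0] = new[-1] = 0.0  # Dirichlet boundaries
        hist.append(new)
    return np.array(hist)
```

The growing memory sum is the non-Markovian signature of the fractional derivative: every new level depends on the whole history, which is also why naive explicit schemes become expensive for long time horizons.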
Abstract:
The validation of Computed Tomography (CT) based 3D models forms an integral part of studies involving 3D models of bones. This is of particular importance when such models are used for Finite Element studies. The validation of 3D models typically involves the generation of a reference model representing the bone's outer surface. Several different devices have been utilised for digitising a bone's outer surface, such as mechanical 3D digitising arms, mechanical 3D contact scanners, electro-magnetic tracking devices and 3D laser scanners. However, none of these devices is capable of digitising a bone's internal surfaces, such as the medullary canal of a long bone. Therefore, this study investigated the use of a 3D contact scanner, in conjunction with a microCT scanner, for generating a reference standard for validating the internal and external surfaces of a CT based 3D model of an ovine femur. One fresh ovine limb was scanned using a clinical CT scanner (Philips Brilliance 64) with a pixel size of 0.4 mm² and slice spacing of 0.5 mm. The limb was then dissected to obtain the soft-tissue-free bone, while care was taken to protect the bone's surface. A desktop mechanical 3D contact scanner (Roland DG Corporation, MDX 20, Japan) was used to digitise the surface of the denuded bone at a resolution of 0.3 × 0.3 × 0.025 mm. The digitised surfaces were reconstructed into a 3D model using reverse engineering techniques in Rapidform (INUS Technology, Korea). After digitisation, the distal and proximal parts of the bone were removed so that the shaft could be scanned with a microCT (µCT40, Scanco Medical, Switzerland) scanner. The shaft, with the bone marrow removed, was immersed in water and scanned with a voxel size of 0.03 mm³. The bone contours were extracted from the image data using the Canny edge filter in Matlab (The MathWorks). The extracted bone contours were reconstructed into 3D models using Amira 5.1 (Visage Imaging, Germany).
The 3D models of the bone's outer surface reconstructed from CT and microCT data were compared against the 3D model generated using the contact scanner. The 3D model of the inner canal reconstructed from the microCT data was compared against the 3D models reconstructed from the clinical CT scanner data. The disparity between the surface geometries of two models was calculated in Rapidform and recorded as an average distance with standard deviation. The comparison of the 3D model of the whole bone generated from the clinical CT data with the reference model gave a mean error of 0.19±0.16 mm, with the shaft being more accurate (0.08±0.06 mm) than the proximal (0.26±0.18 mm) and distal (0.22±0.16 mm) parts. The comparison between the outer 3D model generated from the microCT data and the contact scanner model gave a mean error of 0.10±0.03 mm, indicating that the microCT generated models are sufficiently accurate for validation of 3D models generated from other methods. The comparison of the inner models generated from microCT data with those from the clinical CT data gave an error of 0.09±0.07 mm. Utilising a mechanical contact scanner in conjunction with a microCT scanner thus enabled validation of both the outer surface of a CT based 3D model of an ovine femur and the surface of the model's medullary canal.
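The disparity metric reported above (average closest-point distance with standard deviation between two surface models) can be sketched in a few lines. The brute-force version below is purely illustrative and is not the algorithm Rapidform actually uses:

```python
import numpy as np

def mean_surface_distance(pts_a, pts_b):
    """Mean and standard deviation of closest-point distances from
    point cloud A to point cloud B (brute force; fine for small clouds)."""
    a = np.asarray(pts_a, dtype=float)
    b = np.asarray(pts_b, dtype=float)
    # pairwise distances |a_i - b_j| for every i, j via broadcasting
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    nearest = d.min(axis=1)  # closest B point for each A point
    return nearest.mean(), nearest.std()
```

For dense surface meshes a spatial index (e.g. a k-d tree) replaces the brute-force pairwise distance matrix, but the reported mean ± SD statistic is the same.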
Abstract:
The focus of this Handbook is on Australasian science education (Australasia being a region loosely recognized as including Australia and New Zealand plus nearby Pacific nations such as Papua New Guinea, Solomon Islands, Fiji, Tonga, Vanuatu, and the Samoan islands) and the scholarship that most closely supports this program. The reviews of the research situate what has been accomplished within a given field in an Australasian rather than an international context. The purpose therefore is to articulate and exhibit regional networks and trends that produced specific forms of science education. The thrust lies in identifying the roots of research programs and sketching trajectories, focusing on the changing façade of problems and solutions within regional contexts. The approach allows readers to review what has been done and accomplished, what is missing, and what might be done next.
Abstract:
This paper is the second in a pair that Lesh, English, and Fennewald will be presenting at ICME TSG 19 on Problem Solving in Mathematics Education. The first paper describes three shortcomings of past research on mathematical problem solving. The first shortcoming can be seen in the fact that knowledge has not accumulated; in fact it has atrophied significantly during the past decade. Unsuccessful theories continue to be recycled and embellished. One reason for this is that researchers generally have failed to develop the research tools needed to reliably observe, document, and assess the development of the concepts and abilities that they claim to be important. The second shortcoming is that existing theories and research have failed to make clear how concept development (or the development of basic skills) is related to the development of problem solving abilities, especially when attention shifts beyond the word problems found in school to the kinds of problems found outside of school, where the requisite skills and even the questions to be asked might not be known in advance. The third shortcoming has to do with inherent weaknesses in observational studies and teaching experiments, and with the assumption that a single grand theory should be able to describe all of the conceptual systems, instructional systems, and assessment systems that are strongly molded and shaped by the same theoretical perspectives being used to develop them. Therefore, this paper describes theoretical perspectives and methodological tools that are proving to be effective in combating the preceding kinds of shortcomings. We refer to our theoretical framework as models & modeling perspectives (MMP) on problem solving (Lesh & Doerr, 2003), learning, and teaching. One of the main methodologies of MMP is called multi-tier design studies (MTD).
Abstract:
Objectives: To determine the opinions and experiences of health professionals concerning the management of people with comorbid substance misuse and mental health disorders. Method: We conducted a survey of staff from mental health services and alcohol and drug services across Queensland. Survey items on problems and potential solutions had been generated by focus groups. Results: We analysed responses from 112 staff of alcohol and drug services and 380 mental health staff, representing returns of 79% and 42% respectively of the distributed surveys. One or more issues presented a substantial clinical management problem for 98% of respondents. Needs for increased facilities or services for dual disorder clients figured prominently. These included accommodation or respite care, work and rehabilitation programs, and support groups and resource materials for families. Needs for adolescent dual diagnosis services and after-hours alcohol and drug consultations were also reported. Each of these issues raised substantial problems for over 70% of staff. Another set of problems involved coordination of client care across mental health and alcohol and drug services, including disputes over duty of care. Difficulties with intersectoral liaison were more pronounced for alcohol and drug staff than for mental health staff. A majority of survey respondents identified 13 solutions as practical. These included routine screening for dual diagnosis at intake, and a range of proposals for closer intersectoral communication such as exchanging client information, developing shared treatment plans, conducting joint case conferences and offering consultation facilities. Conclusions: A wide range of problems for the management of comorbid disorders was identified. While the solution of some problems will require resource allocation, many may be addressed by closer liaison between existing services.
Abstract:
This review explores the question of whether chemometrics methods enhance the performance of electroanalytical methods. Electroanalysis has long benefited from well-established techniques such as potentiometric titrations, polarography and voltammetry, and from more novel ones such as electronic tongues and noses, which have enlarged the scope of applications. Electroanalytical methods have been improved by the application of chemometrics for simultaneous quantitative prediction of analytes or qualitative resolution of complex overlapping responses. Typical methods include partial least squares (PLS), artificial neural networks (ANNs), and multiple curve resolution methods (MCR-ALS, N-PLS and PARAFAC). This review aims to provide the practising analyst with a broad guide to electroanalytical applications supported by chemometrics. In this context, after a general consideration of the use of a number of electroanalytical techniques with the aid of chemometrics methods, several overviews follow, each focusing on an important field of application such as food, pharmaceuticals, pesticides and the environment. The growth of chemometrics in conjunction with electronic tongue and nose sensors is highlighted, followed by an overview of the use of chemometrics for the resolution of complicated profiles for qualitative identification of analytes, especially with the use of the MCR-ALS methodology. Finally, the performance of electroanalytical methods is compared with that of some spectrophotometric procedures on the basis of figures of merit. This comparison showed that electroanalytical methods can perform as well as spectrophotometric ones. PLS-1 appears to be the method of practical choice if a relative prediction error of approximately ±10% is acceptable.
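As an illustration of the first method named above, a minimal single-response PLS (PLS1 in its NIPALS form) can be written in a few lines of numpy. This sketch is generic and is not drawn from any particular application in the review:

```python
import numpy as np

def pls1(X, y, n_components):
    """Minimal PLS1 (NIPALS) for a single response; returns a coefficient
    vector and intercept so that y is approximated by X @ coef + intercept."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)        # weight vector
        t = Xc @ w                    # scores
        tt = t @ t
        p = Xc.T @ t / tt             # X loadings
        qk = (yc @ t) / tt            # y loading
        Xc = Xc - np.outer(t, p)      # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    coef = W @ np.linalg.solve(P.T @ W, q)   # B = W (P'W)^-1 q
    return coef, y_mean - x_mean @ coef
```

In a calibration setting each row of X would be a voltammogram (or other instrumental response) and y the analyte concentration; the number of components is usually chosen by cross-validation.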
Abstract:
The binding interaction of the pesticide Isoprocarb and its degradation product, sodium 2-isopropylphenate, with bovine serum albumin (BSA) was studied by spectrofluorimetry under simulated physiological conditions. Both Isoprocarb and sodium 2-isopropylphenate quenched the intrinsic fluorescence of BSA. This quenching proceeded via a static mechanism. The thermodynamic parameters (ΔH°, ΔS° and ΔG°) obtained from the fluorescence data measured at two different temperatures showed that the binding of Isoprocarb to BSA involved hydrogen bonds, while that of sodium 2-isopropylphenate to BSA involved hydrophobic and electrostatic interactions. Synchronous fluorescence spectroscopy of the interaction of BSA with either Isoprocarb or sodium 2-isopropylphenate showed that the molecular structure of BSA was changed significantly, consistent with the known toxicity of the pesticide, i.e., the protein is denatured. Sodium 2-isopropylphenate was estimated to be about 4–5 times more toxic than its parent, Isoprocarb. Synchronous fluorescence spectroscopy and the resolution of the three-way excitation–emission fluorescence spectra by the PARAFAC method extracted the relative concentration profiles of BSA, Isoprocarb and sodium 2-isopropylphenate as a function of the added sodium 2-isopropylphenate. These profiles showed that the degradation product, sodium 2-isopropylphenate, displaced the pesticide in a competitive reaction with the BSA protein.
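The route from binding constants at two temperatures to ΔH°, ΔS° and ΔG° is the standard van 't Hoff treatment. A minimal, generic sketch (no numbers here are taken from the study) is:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def thermodynamic_parameters(K1, T1, K2, T2):
    """Estimate binding thermodynamics from association constants at two
    temperatures: van 't Hoff for dH, dG = -RT ln K, dS = (dH - dG)/T."""
    dH = R * math.log(K2 / K1) / (1.0 / T1 - 1.0 / T2)
    dG1 = -R * T1 * math.log(K1)   # Gibbs energy at T1
    dS = (dH - dG1) / T1
    return dH, dG1, dS
```

The conventional reading of the signs (negative ΔH° and ΔS° indicating hydrogen bonding and van der Waals forces; positive ΔS° indicating hydrophobic and electrostatic contributions) is the reasoning behind the assignments in the abstract.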
Abstract:
Unresolved painful emotional experiences such as bereavement, trauma and disturbances in core relationships, are common presenting problems for clients of psychodrama or psychotherapy more generally. Emotional pain is experienced as a shattering of the sense of self and disconnection from others and, when unresolved, produces avoidant responses which inhibit the healing process. There is agreement across therapeutic modalities that exposure to emotional experience can increase the efficacy of therapeutic interventions. Moreno proposes that the activation of spontaneity is the primary curative factor in psychodrama and that healing occurs when the protagonist (client) engages with his or her wider social system and develops greater flexibility in response to that system. An extensive case-report literature describes the application of the psychodrama method in healing unresolved painful emotional experiences, but there is limited empirical research to verify the efficacy of the method or to identify the processes that are linked to therapeutic change. The purpose of this current research was to construct a model of protagonist change processes that could extend psychodrama theory, inform practitioners’ therapeutic decisions and contribute to understanding the common factors in therapeutic change. Four studies investigated protagonist processes linked to in-session resolution of painful emotional experiences. Significant therapeutic events were analysed using recordings and transcripts of psychodrama enactments, protagonist and director recall interviews and a range of process and outcome measures. A preliminary study (3 cases) identified four themes that were associated with helpful therapeutic events: enactment, the working alliance with the director and with group members, emotional release or relief and social atom repair. 
The second study (7 cases) used Comprehensive Process Analysis (CPA) to construct a model of protagonists’ processes linked to in-session resolution. This model was then validated across four more cases in Study 3. Five meta-processes were identified: (i) a readiness to engage in the psychodrama process; (ii) re-experiencing and insight; (iii) activating resourcefulness; (iv) social atom repair with emotional release and (v) integration. Social atom repair with emotional release involved deeply experiencing a wished-for interpersonal experience accompanied by a free flowing release of previously restricted emotion and was most clearly linked to protagonists’ reports of reaching resolution and to post session improvements in interpersonal relationships and sense of self. Acceptance of self in the moment increased protagonists’ capacity to generate new responses within each meta-process and, in resolved cases, there was evidence of spontaneity developing over time. The fourth study tested Greenberg’s allowing and accepting painful emotional experience model as an alternative explanation of protagonist change. The findings of this study suggested that while the process of allowing emotional pain was present in resolved cases, Greenberg’s model was not sufficient to explain the processes that lead to in-session resolution. The protagonist’s readiness to engage and activation of resourcefulness appear to facilitate the transition from problem identification to emotional release. Furthermore, experiencing a reparative relationship was found to be central to the healing process. This research verifies that there can be in-session resolution of painful emotional experience during psychodrama and protagonists’ reports suggest that in-session resolution can heal the damage to the sense of self and the interpersonal disconnection that are associated with unresolved emotional pain. 
A model of protagonist change processes has been constructed that challenges the view of psychodrama as a primarily cathartic therapy, by locating the therapeutic experience of emotional release within the development of new role relationships. The five meta-processes which are described within the model suggest broad change principles which can assist practitioners to make sense of events as they unfold and guide their clinical decision making in the moment. Each meta-process was linked to specific post-session changes, so that the model can inform the development of therapeutic plans for individual clients and can aid communication for practitioners when a psychodrama intervention is used for a specific therapeutic purpose within a comprehensive program of therapy.
Abstract:
Near-infrared spectroscopy is a somewhat underutilised technique for the study of minerals. The technique has the ability to determine water content, hydroxyl groups and transition metals. In this paper we show the application of NIR spectroscopy to the study of selected minerals. The structure and spectral properties of two Cu-tellurite minerals, graemite and teineite, are compared with the bismuth-containing tellurite mineral smirnite by the application of NIR and IR spectroscopy. The positions of the Cu2+ bands and their splitting in the electronic spectra of the tellurites are in conformity with octahedral geometry distortion. The spectral pattern of smirnite resembles that of graemite, and the observed band at 10855 cm-1 with a weak shoulder at 7920 cm-1 is identified as due to the Cu2+ ion. Any transition metal impurities may be identified by their bands in this spectral region. Three prominent bands observed in the region of 7200-6500 cm-1 are the overtones of water, whilst the weak bands observed near 6200 cm-1 in the tellurites may be attributed to hydrogen bonding between (TeO3)2- and H2O. The observation of a number of bands centred at around 7200 cm-1 confirms molecular water in the tellurite minerals. A number of overlapping bands at low wavenumbers, 4500-4000 cm-1, result from combinational modes of the (TeO3)2- ion. The appearance of the most intense peak at 5200 cm-1 with a pair of weak bands near 6000 cm-1 is a common feature in all the spectra and is related to combinations of the OH vibrations of water molecules and the bending vibration ν2 (δ H2O). The bending vibration δ H2O observed in the IR spectra shows a single band for smirnite at 1610 cm-1. The resolution of this band into a number of components is evidence for non-equivalent types of molecular water in graemite and teineite. (TeO3)2- stretching vibrations are characterized by three main absorptions at 1080, 780 and 695 cm-1.
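A quick way to confirm that the bands quoted above really fall in the near-infrared is to convert wavenumbers to wavelengths via λ[nm] = 10⁷ / ν̃[cm⁻¹]. This helper is a generic unit conversion, not code from the paper:

```python
def wavenumber_to_nm(wavenumber_cm):
    """Convert a wavenumber in cm^-1 to a vacuum wavelength in nm."""
    return 1.0e7 / wavenumber_cm

# e.g. the 10855 cm-1 Cu2+ band lies near 921 nm, and the 7200-6500 cm-1
# water overtones near 1390-1540 nm, squarely in the NIR region.
```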
Abstract:
With the increasing resolution of remote sensing images, road networks can be displayed as continuous, homogeneous regions with a certain width rather than as traditional thin lines. Therefore, road network extraction from large scale images refers to reliable road surface detection instead of road line extraction. In this paper, a novel automatic road network detection approach based on the combination of homogram segmentation and mathematical morphology is proposed, which includes three main steps: (i) the image is classified based on homogram segmentation to roughly identify the road network regions; (ii) morphological opening and closing are employed to fill tiny holes and filter out small road branches; and (iii) the extracted road surface is thinned, pruned by a proposed method and finally simplified with the Douglas-Peucker algorithm. Results from QuickBird images and aerial photos demonstrate the correctness and efficiency of the proposed process.
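The final simplification step names the Douglas-Peucker algorithm. A compact generic implementation (illustrative, not the authors' code) looks like this:

```python
import math

def douglas_peucker(points, epsilon):
    """Simplify a polyline of (x, y) points: if some interior point lies
    farther than epsilon from the chord joining the endpoints, split there
    and recurse; otherwise keep only the two endpoints."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    chord = math.hypot(dx, dy)

    def dist(p):
        # perpendicular distance from p to the chord (distance to the
        # start point when the chord is degenerate)
        if chord == 0.0:
            return math.hypot(p[0] - x1, p[1] - y1)
        return abs(dy * (p[0] - x1) - dx * (p[1] - y1)) / chord

    idx, dmax = max(((i, dist(p)) for i, p in enumerate(points[1:-1], 1)),
                    key=lambda t: t[1])
    if dmax <= epsilon:
        return [points[0], points[-1]]
    left = douglas_peucker(points[:idx + 1], epsilon)
    right = douglas_peucker(points[idx:], epsilon)
    return left[:-1] + right  # drop the duplicated split point
```

Applied to the thinned road centrelines, this collapses near-collinear pixel chains into a handful of vertices while preserving junction geometry.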
Abstract:
The requirements that an insured disclose all facts material to a transaction and not misrepresent material facts in the formation of an insurance contract are universal requirements of insurance law. The nature and extent of these obligations vary from one jurisdiction to the next. Disclosure in the insurance context is distinct from the general approach in commercial contracts, and in other contracts between persons dealing at arm's length. The purpose of this article is therefore to examine, on a comparative basis, the approaches adopted in the Anglo-Commonwealth context of England, Australia, New Zealand and Singapore to the resolution of disclosure issues in the formation of insurance contracts. Particular attention is focused on the Insurance Contracts Act 1984 (Australia), as this statute effects the most significant overhaul of the common law, and the National Consumer Council in the United Kingdom has advocated that similar reforms be adopted.
Abstract:
Interdisciplinary studies are fundamental to the signature practices for the middle years of schooling. Middle years researchers claim that interdisciplinarity in teaching appropriately meets the needs of early adolescents by tying concepts together, providing frameworks for the relevance of knowledge, and demonstrating the linking of disparate information for the solution of novel problems. Cognitive research is not wholeheartedly supportive of this position. Learning theorists assert that the application of knowledge in novel situations for the solution of problems is actually dependent on deep discipline-based understandings. The present research contrasts the capabilities of early adolescent students from discipline-based and interdisciplinary curriculum schooling contexts to successfully solve multifaceted real-world problems. This will inform the development of effective management of the middle years of schooling curriculum.
Abstract:
This paper draws on a recently completed report for the National Roundtable of Nonprofit Organisations (Lyons, North-Samardzic and Young, 2007) to determine the extent and dimensions of problems nonprofit organisations have in accessing the capital they need.
Abstract:
Campylobacter jejuni, followed by Campylobacter coli, contributes substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to the large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost-effective interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat are frequently associated with human campylobacteriosis in Australia. Therefore, chicken-derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphism) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. The present study showed that SNP and binary typing, alone or in combination, are effective at detecting epidemiological linkage between chicken-derived Campylobacter isolates and enable data comparisons with other MLST based investigations. SNP + binary types obtained from the chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates.
Common genotypes between the two collections of isolates were identified, and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent from the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) curve analysis method to interrogate the hypervariable Campylobacter flagellin-encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step, closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA-specific primers that did not amplify the flaB gene. HRM curve analysis based flaA interrogation successfully discriminated the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates, and the results correlated with the epidemiological background of the isolates. In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive Hierarchical Resolving Assays using Nucleic Acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method.
Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative ‘nucleating SNPs’ to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using (i) ‘Minimum SNPs’ and (ii) the new ‘HRMtype’ software packages. Species-specific sets of six nucleating SNPs and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. Minim typing was tested empirically by typing 15 C. jejuni and five C. coli isolates. The clonal complexes (CC) assigned to each isolate by Minim typing and by SNP + binary typing were used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent across both methods. Thus, Minim typing is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to match the resolution of, sequence-based MLST gene interrogation. Minim typing in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on a unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high throughput genotyping of these common food-borne pathogens.
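The combinatorial idea running through these aims (interrogate a handful of informative loci, then compare the resulting state profiles across isolates) can be illustrated with a toy sketch. The isolate names and profiles below are invented, and this is not the software used in the study:

```python
from collections import defaultdict

def hamming(p1, p2):
    """Number of loci at which two SNP-state profiles differ."""
    assert len(p1) == len(p2), "profiles must cover the same loci"
    return sum(a != b for a, b in zip(p1, p2))

def group_by_profile(isolates):
    """Group isolates sharing an identical SNP-state profile.
    isolates: dict mapping isolate name -> profile string (e.g. 'AGGTCA',
    one character per interrogated locus)."""
    groups = defaultdict(list)
    for name, profile in isolates.items():
        groups[profile].append(name)
    return dict(groups)
```

Identical profiles suggest epidemiological linkage at the resolution of the chosen loci, while the pairwise Hamming distance gives a crude measure of relatedness; the real schemes layer differentially evolving markers (MLST SNPs, binary genes, flaA, CRISPR) to tune that resolution.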