960 results for data sets


Relevance: 60.00%

Abstract:

This thesis is a problematisation of the development and implementation of professional standards as the mechanism for enhancing professionalism and teacher quality in the teaching force in Australia and, more specifically, Queensland. Drawing on tools from Foucauldian archaeological analysis, the dominant discourses of professionalism in the academic literature, Australian federal and state policy documents, and narratives from Queensland teachers are examined. These data sets are then cross-referenced, analysing the intersections and divergences between the different texts. Findings suggest that, through policy, political strategy and derisory statements from various authoritative voices, the managerial discourse of professionalism embodied in professional standards documents has been unduly privileged as a means of regulating teachers, even though teachers themselves do not share this dominant notion of professionalism. The teachers in this study proffer 'new classical-practical professionalism' as a counter-discourse, or discourse of resistance, to managerialism. However, an application of Foucault's theorisations of power-knowledge reveals that, through their spoken discourses, they are in fact yielding to the discourse of professional standards as docile bodies.

Relevance: 60.00%

Abstract:

Research over the last two decades has significantly increased our understanding of the evolutionary position of the insects among other arthropods and of the relationships among the insect orders. Many of these insights have been established through increasingly sophisticated analyses of DNA sequence data from a limited number of genes. Recent results have established the relationships within the Holometabola, but relationships among the hemimetabolous orders have been more difficult to elucidate. A strong consensus on the relationships among the Palaeoptera (Ephemeroptera and Odonata), and on their relationship to the Neoptera, has not emerged, with all three possible resolutions supported by different data sets. While polyneopteran relationships have generally resisted resolution, it is now clear that the termites, Isoptera, are nested within the cockroaches, Blattodea. The newly discovered order Mantophasmatodea is difficult to place, with the balance of studies favouring Grylloblattodea as its sister group. While some studies have found the paraneopteran orders (Hemiptera, Thysanoptera, Phthiraptera and Psocoptera) to be monophyletic, evidence suggests that the parasitic lice (Phthiraptera) have evolved from groups within the book and bark lice (Psocoptera) and may represent parallel origins of parasitism within two major louse groups. Within the Holometabola, it is now clear that the Hymenoptera are the sister group to the other orders, which in turn are divided into two clades, the Neuropteroidea (Coleoptera, Neuroptera and relatives) and the Mecopterida (Trichoptera, Lepidoptera, Diptera and their relatives). The enigmatic order Strepsiptera, the twisted-wing insects, has now been placed firmly near the Coleoptera, rejecting the close relationship to Diptera proposed some 15 years ago, primarily on the basis of ribosomal DNA data.

Phylogenomic-scale analyses are only beginning to be focused on the relationships of the insect orders, and this is where we expect to see resolution of palaeopteran and polyneopteran relationships. Future research will benefit from greater coordination between intra- and inter-ordinal analyses. This will maximise the opportunities for appropriate outgroup choice at the intra-ordinal level and provide the background knowledge for inter-ordinal analyses to span the maximum phylogenetic scope within groups.

Relevance: 60.00%

Abstract:

For facial expression recognition systems to be applicable in the real world, they need to detect and track a previously unseen person's face and its facial movements accurately in realistic environments. A highly plausible solution involves performing a "dense" form of alignment, where 60-70 fiducial facial points are tracked with high accuracy. The problem is that, in practice, this type of dense alignment had so far been impossible to achieve in a generic sense, mainly due to poor reliability and robustness. Instead, many expression detection methods have opted for a "coarse" form of face alignment, followed by an application of a biologically inspired appearance descriptor such as the histogram of oriented gradients or Gabor magnitudes. Encouragingly, recent advances in a number of dense alignment algorithms, e.g., constrained local models (CLMs), have demonstrated both high reliability and accuracy for unseen subjects. This begs the question: aside from countering illumination variation, what do these appearance descriptors do that standard pixel representations do not? In this paper, we show that, when close to perfect alignment is obtained, there is no real benefit in employing these different appearance-based representations (under consistent illumination conditions). In fact, when misalignment does occur, we show that these appearance descriptors work well precisely by encoding robustness to alignment error. For this work, we compared two popular methods for dense alignment, subject-dependent active appearance models versus subject-independent CLMs, on the task of action-unit detection. These comparisons were conducted through a battery of experiments across various publicly available data sets (i.e., CK+, Pain, M3 and GEMEP-FERA). We also report our performance in the recent 2011 Facial Expression Recognition and Analysis Challenge for the subject-independent task.
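The paper's central observation, that appearance descriptors buy robustness to alignment error rather than extra discriminative power, can be illustrated with a toy experiment. The sketch below uses a crude block-wise intensity histogram as a stand-in for HOG or Gabor descriptors (it is not the descriptor used in the paper), and a random patch with an invented 2-pixel shift as the "misalignment":

```python
import numpy as np

# Toy illustration: a histogram-style descriptor changes much less under
# a small misalignment than raw pixel values do.
rng = np.random.default_rng(5)
patch = rng.random((32, 32))
shifted = np.roll(patch, 2, axis=1)          # simulated 2-pixel alignment error

def hist_desc(img, bins=16):
    # Block-wise intensity histograms: a crude stand-in for HOG/Gabor.
    # Split the 32x32 image into sixteen 8x8 blocks, histogram each one.
    blocks = img.reshape(4, 8, 4, 8).transpose(0, 2, 1, 3).reshape(16, 64)
    return np.concatenate([np.histogram(b, bins=bins, range=(0, 1))[0]
                           for b in blocks]) / 64.0

pix_err = np.abs(patch - shifted).mean()                       # raw pixel distance
h_err = np.abs(hist_desc(patch) - hist_desc(shifted)).mean()   # descriptor distance
print(h_err < pix_err)  # the descriptor is far more tolerant of the shift
```

Each 8x8 block still contains mostly the same pixel values after a small shift, so its histogram barely moves, which is exactly the "encoding robustness to alignment error" effect the paper describes.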

Relevance: 60.00%

Abstract:

This paper presents an approach to modelling the resilience of a generic (potable) water supply system. The system is contextualised as a meta-system consisting of three subsystems representing the natural catchment, the water treatment plant and the water distribution infrastructure for urban use. An abstract mathematical model of the meta-system is disaggregated progressively to form a cascade of equations, forming a relational matrix of models. This allows the investigation of commonly implicit relationships between various operational components within the meta-system, an in-depth understanding of specific system components and influential factors, and the incorporation of explicit disturbances to explore system behaviour. Consequently, this will facilitate long-term decision making to achieve sustainable solutions for issues such as meeting growing demand or managing supply-side influences in the meta-system under diverse water availability regimes. The approach is based on the hypothesis that resilient water supply may be better managed by modelling the effects of changes at specific levels that have a direct, or in some cases indirect, impact on higher-order outcomes. Additionally, the proposed strategy allows approaches to be defined for combining disparate data sets to synthesise previously missing or incomplete higher-order information; it provides a scientifically robust means of defining and carrying out meta-analyses using knowledge from diverse yet relatable disciplines relevant to different levels of the system; and it enhances the understanding of dependencies and inter-dependencies of variable factors at various levels across the meta-system. The proposed concept introduces an approach for modelling a complex infrastructure system as a meta-system consisting of a combination of bio-ecological, technical and socio-technical subsystems.
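The cascade idea, where each subsystem's output feeds the next, can be sketched as a chain of toy functions. Everything here is an illustrative assumption (the function names, coefficients and numbers are invented, not taken from the paper's model); the point is only the structure, a meta-system composed of catchment, treatment and distribution stages:

```python
# Minimal sketch of a three-subsystem meta-system (all coefficients invented).
def catchment(rain_mm, runoff_coeff=0.3):
    return rain_mm * runoff_coeff           # raw water yield from rainfall

def treatment(raw, capacity=50.0, loss=0.05):
    return min(raw, capacity) * (1 - loss)  # treated water, capped by plant capacity

def distribution(treated, leakage=0.1):
    return treated * (1 - leakage)          # water actually delivered to users

def meta_system(rain_mm):
    # The cascade: catchment output feeds treatment, which feeds distribution.
    return distribution(treatment(catchment(rain_mm)))

print(round(meta_system(100.0), 2))  # 100 mm rain -> 30 raw -> 28.5 treated -> 25.65 delivered
```

Because the stages are explicit, a disturbance (e.g. reduced runoff, or a capacity cut) can be injected at any level and its effect traced through to the higher-order outcome, which is the kind of analysis the relational matrix of models is meant to support.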

Relevance: 60.00%

Abstract:

In this paper we discuss whether corruption is contagious and whether conditional cooperation matters, using the notion of "conditional corruption" for these effects. We analyze whether the perceived justifiability of corrupt behaviour is influenced by the perceived activities of others. Moreover, we explore whether, and to what extent, group dynamics, socialization and past experiences affect corruption. We present evidence using two micro-level data sets and a large macro-level international panel data set. The results indicate that the willingness to engage in corruption is influenced by the perceived activities of peers and other individuals. Moreover, the macro-level panel data set indicates that the past level of corruption has a strong impact on the current corruption level.

Relevance: 60.00%

Abstract:

In this paper, we address the puzzle of the relationship between age and happiness. While the majority of psychologists have concluded there is not much of a relationship at all, the economic literature has unearthed a possible U-shaped relationship, with the minimum level of satisfaction occurring in middle age (35-50). We look for a U-shape in three panel data sets: the German Socio-Economic Panel (GSOEP), the British Household Panel Survey (BHPS) and the Household, Income and Labour Dynamics in Australia (HILDA) survey. We find that the raw data mainly support a wave-like shape that only weakly looks U-shaped over the 20-60 age range. That weak U-shape in middle age becomes more pronounced when allowing for socio-economic variables. When we then take account of selection effects via fixed effects, however, the dominant age effect in all three panels is a strong happiness increase around the age of 60, followed by a major decline after 75, with the U-shape in middle age disappearing, such that there is almost no change in happiness between the ages of 20 and 50.
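The fixed-effects (within) transformation that drives the paper's final result can be illustrated on a toy panel. This is a sketch only: the simulated data, the single post-60 "jump" regressor and all variable names are assumptions, not the GSOEP/BHPS/HILDA data or the paper's full specification.

```python
import numpy as np

# Toy panel: n individuals observed for T years. Each person has a fixed
# baseline happiness (alpha); the true age profile here is flat until 60,
# then 0.5 points higher, a stylised stand-in for the paper's finding.
rng = np.random.default_rng(2)
n, T = 500, 10
person = np.repeat(np.arange(n), T)
start_age = rng.integers(20, 70, size=n)
age = np.repeat(start_age, T) + np.tile(np.arange(T), n)
alpha = rng.normal(0, 1, size=n)[person]         # individual fixed effects
true_profile = np.where(age >= 60, 0.5, 0.0)     # happiness jump at 60
y = alpha + true_profile + rng.normal(0, 0.3, size=n * T)

# Within transformation: subtract each person's own mean from the outcome
# and the regressor, sweeping out the fixed effects (selection on levels).
x = (age >= 60).astype(float)
def demean(v, g):
    means = np.bincount(g, weights=v) / np.bincount(g)
    return v - means[g]
yd, xd = demean(y, person), demean(x, person)
beta = (xd @ yd) / (xd @ xd)   # FE estimate of the post-60 happiness shift
print(round(beta, 2))          # close to the true 0.5
```

Only individuals who cross age 60 inside the panel identify `beta`; people with permanently high or low baselines contribute nothing after demeaning, which is how fixed effects separate age effects from selection effects.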

Relevance: 60.00%

Abstract:

3D models of long bones are utilised in a number of fields, including orthopaedic implant design. Accurate reconstruction of 3D models is of utmost importance for designing implants that achieve good alignment between two bone fragments. For this purpose, CT scanners are employed to acquire accurate bone data, exposing the individual to a high dose of ionising radiation. Magnetic resonance imaging (MRI) has been shown to be a potential alternative to computed tomography (CT) for scanning volunteers for 3D reconstruction of long bones, essentially avoiding the high radiation dose of CT. In MR imaging of long bones, artefacts due to random movements of the skeletal system create challenges, as they generate inaccuracies in 3D models reconstructed from data sets containing such artefacts. One defect observed during an initial study is a lateral shift artefact in the reconstructed 3D models. This artefact is believed to result from the volunteer moving the leg between two successive scanning stages (the lower limb has to be scanned in at least five stages due to the limited scanning length of the scanner). As this artefact creates inaccuracies in implants designed using these models, it needs to be corrected before the 3D models are applied to implant design. Therefore, this study aimed to correct the lateral shift artefact using 3D modelling techniques. The femora of five ovine hind limbs were scanned with a 3T MRI scanner using a 3D VIBE-based protocol. The scanning was conducted in two halves, while maintaining a good overlap between them. A lateral shift was generated by moving the limb several millimetres between the two scanning stages. The 3D models were reconstructed using a multi-threshold segmentation method.

The correction of the artefact was achieved by aligning the two halves with the robust iterative closest point (ICP) algorithm, using the overlapping region between the two. The corrected models were compared with a reference model generated by CT scanning of the same sample. The results indicate that the correction was achieved with an average deviation of 0.32 ± 0.02 mm between the corrected model and the reference model. In comparison, the model obtained from a single MRI scan showed an average error of 0.25 ± 0.02 mm against the reference model. An average deviation of 0.34 ± 0.04 mm was seen when models generated after the table was moved were compared with the reference models; thus, movement of the table is also a contributing factor to the motion artefacts.
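The alignment step lends itself to a compact illustration. Below is a generic point-to-point ICP sketch with a standard Kabsch rigid-transform solver; it is not the authors' specific robust ICP implementation, and the synthetic point cloud and the 3 mm lateral shift are invented for illustration (in the study, the matched points would come from the overlapping region of the two scan halves):

```python
import numpy as np

def best_rigid_transform(A, B):
    # Least-squares rotation + translation mapping points A onto B (Kabsch).
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])              # guard against reflections
    R = Vt.T @ D @ U.T
    return R, cb - R @ ca

def icp(source, target, iters=20):
    # Iterative closest point: match each source point to its nearest
    # target point, solve for the best rigid transform, and repeat.
    src = source.copy()
    for _ in range(iters):
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(-1)
        R, t = best_rigid_transform(src, target[d2.argmin(axis=1)])
        src = src @ R.T + t
    return src

rng = np.random.default_rng(1)
target = rng.uniform(0, 100, size=(200, 3))    # synthetic overlap region, mm
source = target + np.array([3.0, 0.0, 0.0])    # simulated 3 mm lateral shift
aligned = icp(source, target)
err = np.abs(aligned - target).max()
print(err)  # residual misalignment after ICP, in mm
```

Because the simulated shift is small relative to the point spacing, nearest-neighbour matching quickly locks onto the true correspondences and the residual collapses to numerical precision; real scan data needs the robust variants the paper uses.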

Relevance: 60.00%

Abstract:

Background: To derive preference-based measures from various condition-specific descriptive health-related quality of life (HRQOL) measures, a general two-stage method has evolved: 1) an item from each domain of the HRQOL measure is selected to form a health state classification system (HSCS); 2) a sample of health states is valued and an algorithm derived for estimating the utility of all possible health states. The aim of this analysis was to determine whether confirmatory or exploratory factor analysis (CFA, EFA) should be used to derive a cancer-specific utility measure from the EORTC QLQ-C30. Methods: Data were collected with the QLQ-C30v3 from 356 patients receiving palliative radiotherapy for recurrent or metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter based on a conceptual model (the established domain structure of the QLQ-C30: physical, role, emotional, social and cognitive functioning, plus several symptoms) and clinical considerations (the views of both patients and clinicians about issues relevant to HRQOL in cancer). The dimensions determined by each method were then subjected to item response theory analysis, including Rasch analysis. Results: The CFA results generally supported the proposed conceptual model, with residual correlations requiring only minor adjustments (namely, the introduction of two cross-loadings) to improve model fit (χ²(2) increment = 77.78, p < .001). Although the EFA revealed a structure similar to the CFA, some items had loadings that were difficult to interpret. Further assessment of dimensionality with Rasch analysis aligned the EFA dimensions more closely with the CFA dimensions. Three items exhibited floor effects (>75% of observations at the lowest score), six exhibited misfit to the Rasch model (fit residual > 2.5), none exhibited disordered item response thresholds, and four exhibited differential item functioning (DIF) by gender or cancer site.

Of the remaining items, three were considered less clinically important than the other nine. Conclusions: CFA appears more appropriate than EFA, given the well-established structure of the QLQ-C30 and its clinical relevance. Further, the confirmatory approach produced more interpretable results than the exploratory approach. Other aspects of the general method remain largely the same. The revised method will be applied to a large number of data sets as part of the international and interdisciplinary project to develop a multi-attribute utility instrument for cancer (MAUCa).

Relevance: 60.00%

Abstract:

A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency condition monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to large, unwieldy data sets. The effectiveness of the algorithm was evaluated and tested on four sets of data. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The other two sets were computer-simulated bearing defect signals containing either a single or multiple bearing defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sample ratio was used (i.e., down-sampling by a factor of 500). In contrast, down-sampling with the conventional technique from signal processing eliminates useful and critical information, such as bearing defect frequencies, when the same down-sample ratio is employed. Noise and artificial frequency components were also induced by the conventional technique, thus limiting its usefulness for machine condition monitoring applications.
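The peak-hold idea can be sketched in a few lines: from each block of samples, keep the sample with the largest magnitude, so that sparse defect impulses survive aggressive down-sampling. This is an illustrative reconstruction of the general principle, not the paper's exact PHDS implementation; the function name and the synthetic signal are assumptions.

```python
import numpy as np

def peak_hold_downsample(signal, ratio):
    # From each block of `ratio` samples, keep the sample with the
    # largest absolute value (sign preserved), discarding the rest.
    n_blocks = len(signal) // ratio
    blocks = np.asarray(signal[:n_blocks * ratio], dtype=float).reshape(n_blocks, ratio)
    idx = np.abs(blocks).argmax(axis=1)
    return blocks[np.arange(n_blocks), idx]

# A burst-like "defect" signal: low-level noise with sparse large impacts.
rng = np.random.default_rng(0)
x = 0.1 * rng.standard_normal(50_000)
x[::5_000] += 5.0                    # an impact every 5000 samples
y = peak_hold_downsample(x, 500)     # 500x reduction, as in the paper
print(len(y), y.max())               # the impacts survive the down-sampling
```

A plain decimation (`x[::500]`) would almost certainly miss every impact, which is the contrast the paper draws with conventional down-sampling (and conventional decimation without adequate low-pass filtering also aliases high-frequency content into artificial components).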

Relevance: 60.00%

Abstract:

Cartilage defects heal imperfectly, and osteoarthritic changes frequently develop as a result. Although the existence of specific behaviours of chondrocytes derived from various depth-related zones in vitro has been known for over 20 years, only a relatively small body of in vitro studies has been performed with zonal chondrocytes, and current clinical treatment strategies do not reflect these native depth-dependent (zonal) differences. This is surprising, since mimicking the zonal organization of articular cartilage in neo-tissue by using zonal chondrocyte subpopulations could enhance the functionality of the graft. Although some research groups, including our own, have made considerable progress in tailoring culture conditions using specific growth factors and biomechanical loading protocols, we conclude that an optimal regime has not yet been determined. Other unmet challenges include the lack of specific zonal cell sorting protocols and the limited number of cells harvested per zone. As a result, the engineering of functional tissue has not yet been realized, and no long-term in vivo studies using zonal chondrocytes have been described. This paper critically reviews the research performed to date and outlines our view of the potential future significance of zonal chondrocyte populations in regenerative approaches for the treatment of cartilage defects. We also briefly discuss the capabilities of additive manufacturing technologies that can not only create patient-specific grafts directly from medical imaging data sets but can also more accurately reproduce the complex 3D zonal extracellular matrix architecture using techniques such as hydrogel-based cell printing.

Relevance: 60.00%

Abstract:

The average structure (C1̄) of a volcanic plagioclase megacryst with composition Ano, from the Hogarth Ranges, Australia, has been determined using three-dimensional, single-crystal neutron and X-ray diffraction data. Least-squares refinements, incorporating anisotropic thermal motion of all atoms and an extinction correction, resulted in weighted R factors (based on intensities) of 0.076 and 0.056, respectively, for the neutron and X-ray data. Very weak e reflections could be detected in long-exposure X-ray and electron diffraction photographs of this crystal, but the refined average structure is believed to be unaffected by the presence of such a weak superstructure. The ratio of the scattering power of Na to that of Ca differs between X-ray and neutron radiation, and this radiation dependence of scattering power has been used to determine the distribution of Na and Ca over a split-atom M site (two sites designated M′ and M″) in this plagioclase. Relative peak-height ratios M′/M″, revealed in difference Fourier sections calculated from the neutron and X-ray data, formed the basis for the cation-distribution analysis. As the neutron and X-ray data sets were directly compared in this analysis, it was important that systematic bias between refined neutron and X-ray positional parameters could be demonstrated to be absent. In summary, with an M-site model constrained only by the electron-microprobe-determined bulk composition of the crystal, the following M-site occupancies were obtained: Na(M′) = 0.29(7), Na(M″) = 0.23(7), Ca(M′) = 0.15(4) and Ca(M″) = 0.33(4). These results indicate that the restrictive assumptions about M sites on which previous plagioclase refinements have been based are not applicable to this composition, and possibly not to the entire compositional range. T-site ordering determined from (T–O) bond-length variation, with t₁o = 0.51(1) and t₁m = t₂o = t₂m = 0.32(1), is weak, as might be expected from the volcanic origin of this megacryst.

Relevance: 60.00%

Abstract:

The behaviour of single installations of solar energy systems is well understood; however, what happens at an aggregated location, such as a distribution substation, when the outputs of groups of installations accumulate is not so well understood. This paper considers groups of installations attached to distribution substations whose load is primarily commercial and industrial. Agent-based modelling has been used to model the physical electrical distribution system and the behaviour of equipment outputs towards the consumer end of the network. The paper reports the approach used to simulate both the electricity consumption of groups of consumers and the output of solar systems subject to weather variability, with the inclusion of cloud data from the Bureau of Meteorology (BOM). The data sets currently used are for Townsville, North Queensland. The initial characteristics that indicate whether solar installations are cost-effective from an electricity distribution perspective are discussed.
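The aggregation idea can be sketched with a minimal agent-based toy: each agent is a PV installation whose output scales a shared irradiance profile, and the substation sees total load minus total PV output. The clear-sky shape, the cloud factor and all capacity and load numbers below are invented assumptions, not the paper's model or BOM data.

```python
import numpy as np

rng = np.random.default_rng(3)
hours = np.arange(24)
# Stylised clear-sky output shape: zero outside roughly 6am-6pm.
clear_sky = np.clip(np.sin((hours - 6) * np.pi / 12), 0, None)
# Hypothetical hourly cloud attenuation factor shared by all agents.
cloud = 1 - 0.4 * rng.random(24)

class PVAgent:
    """One PV installation; its output scales the shared irradiance profile."""
    def __init__(self, kw):
        self.kw = kw
    def output(self):
        return self.kw * clear_sky * cloud

# Fifty installations of invented sizes feeding one substation.
agents = [PVAgent(kw) for kw in rng.uniform(2, 10, size=50)]
pv_total = sum(a.output() for a in agents)
load = 300 + 100 * clear_sky        # stylised commercial daytime load, kW
net = load - pv_total               # demand seen at the substation
print(net[12], load[12])            # PV depresses midday net demand
```

Even this toy shows the substation-level questions the paper raises: correlated cloud cover moves every agent's output together, so the aggregate does not smooth out the way independent variation would.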

Relevance: 60.00%

Abstract:

Despite the compelling case for moving towards cloud computing, the upstream oil & gas industry faces several technical challenges—most notably, a pronounced emphasis on data security, a reliance on extremely large data sets, and significant legacy investments in information technology (IT) infrastructure—that make a full migration to the public cloud difficult at present. Private and hybrid cloud solutions have consequently emerged within the industry to yield as much benefit from cloud-based technologies as possible while working within these constraints. This paper argues, however, that the move to private and hybrid clouds will very likely prove only to be a temporary stepping stone in the industry’s technological evolution. By presenting evidence from other market sectors that have faced similar challenges in their journey to the cloud, we propose that enabling technologies and conditions will probably fall into place in a way that makes the public cloud a far more attractive option for the upstream oil & gas industry in the years ahead. The paper concludes with a discussion about the implications of this projected shift towards the public cloud, and calls for more of the industry’s services to be offered through cloud-based “apps.”

Relevance: 60.00%

Abstract:

For interactive systems, recognition, reproduction and generalization of observed motion data are crucial for successful interaction. In this paper, we present a novel method for the analysis of motion data that we refer to as K-OMM-trees. K-OMM-trees combine Ordered Means Models (OMMs), a model-based machine learning approach for time series, with a hierarchical analysis technique for very large data sets, the K-tree algorithm. The proposed K-OMM-trees enable unsupervised prototype extraction from motion time series data with a hierarchical data representation. After introducing the algorithmic details, we apply the proposed method to a gesture data set that includes substantial inter-class variation. Results from our studies show that K-OMM-trees substantially increase recognition performance and learn an inherent data hierarchy with meaningful gesture abstractions.
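The overall shape of the method, prototypes arranged in a tree built by recursive clustering, can be shown with a toy version: plain k-means stands in for fitting Ordered Means Models, and a recursive k-way split stands in for the K-tree. All data, names and parameters below are synthetic assumptions, not the paper's algorithm or gesture data.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    # Plain k-means; a stand-in for fitting Ordered Means Models here.
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = ((X[:, None] - C[None]) ** 2).sum(-1).argmin(1)
        C = np.array([X[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return C, lab

def ktree(X, k, depth):
    # Recursive k-way split: each node stores k prototypes, and each
    # child refines one of them - a toy version of a K-tree hierarchy.
    C, lab = kmeans(X, k)
    node = {"prototypes": C, "children": []}
    if depth > 1:
        for j in range(k):
            if (lab == j).sum() >= k:
                node["children"].append(ktree(X[lab == j], k, depth - 1))
    return node

# Three synthetic "gesture" trajectory classes, sampled as 20-point curves.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 20)
classes = [np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), 2 * t - 1]
X = np.vstack([c + 0.05 * rng.standard_normal((30, 20)) for c in classes])
tree = ktree(X, k=3, depth=2)
print(tree["prototypes"].shape)   # 3 top-level prototype trajectories
```

The top level yields coarse gesture abstractions and the children refine them, mirroring the hierarchical prototype extraction the paper describes (the real method replaces Euclidean k-means with OMM likelihoods to handle time-series alignment).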
