58 results for Characteristic Initial Value Problem


Relevance:

30.00%

Publisher:

Abstract:

Purpose: This chapter investigates an episode in which a supervising teacher on playground duty asks two boys to each give an account of their actions over an incident that had just occurred on some climbing equipment in the playground. Methodology: The chapter employs an ethnomethodological approach using conversation analysis. The data are taken from a corpus of video-recorded interactions between children, aged 7-9 years, and the teacher, in school playgrounds during the lunch recess. Findings: The findings show the ways that children work up accounts of their playground practices when asked by the teacher. The teacher initially provided interactional space for each child to give their version of events. Ultimately, the teacher’s version of how to act in the playground became the sanctioned one. The children and the teacher formulated particular social orders of behavior in the playground through multimodal devices, direct reported speech and scripts. Such public displays of talk work as socialization practices that frame teacher-sanctioned, morally appropriate actions in the playground. Value of paper: This chapter shows the pervasiveness of the teacher’s social order, as she presented an institutional social order of how to interact in the playground, showing clearly the disjunction between adult and child orders.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, ecological thinking has been applied to a range of social, cultural and aesthetic systems, including performing arts as a living system of policy makers, producers, organisations, artists and audiences. Ecological thinking is systems-based thinking which allows us to see the performing arts as a complex and protean ecosystem; to explain how elements in this system act and interact; and to evaluate its effects on Australia’s social fabric over time. According to Gallasch, ecological thinking is “what we desperately need for the arts.” It enables us to “defeat the fragmentary and utilitarian view of the arts that dominates, to make connections, to establish overviews of the arts that can be shared and debated” (Gallasch NP). John Baylis took up these issues in "Mapping Queensland Theatre" (2009), an Arts Queensland-funded survey designed to map practices in Brisbane and in Queensland more broadly, and to provide a platform to support future policy-making. In this paper, we propose a new approach to mapping Brisbane’s and Queensland’s theatre that extends Baylis’ ‘value chain’ into a ‘value ecology’ that provides a more textured picture of players, patterns, relationships and activity levels in local performing arts.

Relevance:

30.00%

Publisher:

Abstract:

Airports and cities inevitably recognise the value that each brings to the other; however, the separation in decision-making authority over what to build, where, when and how provides a conundrum for both parties. Airports often want a say in what is developed outside the airport fence, and cities often want a say in what is developed inside the airport fence. Defining how much of a say airports and cities have in decisions beyond their jurisdictional control is likely to remain a live topic so long as airports and cities maintain separate formal decision-making processes for what to build, where, when and how. However, the recent Green and White Papers for a new National Aviation Policy have made early inroads into formalising relationships between Australia’s major airports and their host cities. At present, no clear indication (in practice or in the literature) exists as to the appropriateness of different governance arrangements for development decisions in situations that bring together the opposing strategic interests of airports and cities, leaving decisions about infrastructure development as complex decision-making spaces that hold airport and city/regional interests at stake. The line of enquiry is motivated by a lack of empirical research on networked decision-making domains outside the realm of institutional theorists (Agranoff & McGuire, 2001; Provan, Fish & Sydow, 2007). That is, the governance literature has remained focused on abstract conceptualisations of organisation, without attending to the minutiae of how organisation influences action in real-world applications. A recent study by Black (2008) has provided an initial foothold for governance researchers into networked decision-making domains. This study builds upon Black’s (2008) work by aiming to explore and understand the problem space of making decisions subject to complex jurisdictional and relational interdependencies. That is, the research examines the formal and informal structures, relationships, and forums that operationalise debates and interactions between decision-making actors as they vie for influence over deciding what to build, where, when and how in airport-proximal development projects. The research mobilises a mixture of qualitative and quantitative methods to examine three embedded cases of airport-proximal development from a network governance perspective. Findings from the research provide a new understanding of the ways in which informal actor networks underpin and combine with formal decision-making networks to create new (or realigned) governance spaces that facilitate decision-making during complex phases of development planning. The research is timely, and responds to Isett, Mergel, LeRoux, Mischen and Rethemeyer’s (2011) critique of limitations within the current network governance literature, specifically their noted absence of empirical studies that acknowledge and interrogate the simultaneity of formal and informal network structures within network governance arrangements (Isett et al., 2011, pp. 162-166). The combination of social network analysis (SNA) techniques and thematic enquiry has enabled the findings to document and interpret the ways in which decision-making actors organise to overcome complex problems in planning infrastructure.
An innovative approach to using association networks provides insights into the importance of the different ways actors interact with one another, a simple yet valuable addition to the increasingly popular discipline of SNA. The research also identifies when and how different types of networks (i.e. formal and informal) are able to overcome currently known limitations of network governance (see McGuire & Agranoff, 2011), thus adding depth to the emerging body of network governance literature on the limitations of network ways of working (i.e. Rhodes, 1997a; Keast & Brown, 2002; Rethemeyer & Hatmaker, 2008; McGuire & Agranoff, 2011). Contributions are made to practice through a timely understanding of how horizontal fora between airports and their regions are used, particularly of how they reframe the governance of decision-making for airport-proximal infrastructure development. This new understanding will enable government and industry actors to better understand the structural impacts of governance arrangements before they design or adopt them, particularly with respect to factors such as efficiency of information, oversight, and responsiveness to change.
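
By way of illustration only, the kind of brokerage question SNA answers here can be sketched in a few lines. The actors, the formal/informal tie lists, and the choice of betweenness centrality below are invented assumptions for demonstration, not the study's association networks.

```python
# Hypothetical sketch: overlay formal and informal ties in one actor network
# and ask who brokers between the others. All names and ties are invented.
import networkx as nx

formal = [("Airport", "Council"), ("Council", "StateGov")]
informal = [("Airport", "Developer"), ("Developer", "Council"),
            ("StateGov", "Developer")]

G = nx.Graph()
G.add_edges_from(formal, kind="formal")
G.add_edges_from(informal, kind="informal")

# Betweenness centrality flags actors sitting on many shortest paths,
# a simple proxy for brokerage across the combined governance network.
print(nx.betweenness_centrality(G))
```
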

Relevance:

30.00%

Publisher:

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trivial studies. As genomic analyses have become more sophisticated and moved toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics, and when used in conjunction with comparative genomics they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcriptional regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and TFs grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available.
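
The promoter-strength test described above can be illustrated with a small sketch. The homology scores, group sizes, and the one-sided Welch variant below are assumptions for demonstration; the thesis data and exact test specification are not reproduced.

```python
# Illustrative one-sided two-sample t-test of the hypothesis that weak
# (low-homology) promoters associate with activator sites while strong
# promoters associate with repressor sites. All scores are invented.
from scipy import stats

activator_associated = [0.42, 0.51, 0.38, 0.45, 0.40, 0.48]  # promoter scores
repressor_associated = [0.55, 0.49, 0.60, 0.52, 0.58, 0.47]

t_stat, p_value = stats.ttest_ind(activator_associated, repressor_associated,
                                  equal_var=False, alternative="less")
print(f"t = {t_stat:.3f}, one-sided p = {p_value:.3f}")
# As in the study, a p-value below 0.1 would count as suggestive support.
```
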
Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in the false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While the common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity; and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
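
To make the spectrum-kernel SVM approach concrete, here is a minimal sketch assuming scikit-learn, k = 3, and toy sequences; a linear kernel over k-mer count features is exactly the spectrum kernel. This is not the thesis pipeline or its CRP data.

```python
# Minimal spectrum-kernel SVM sketch for binding-site classification.
# Sequences and labels are toy examples, not the thesis datasets.
from itertools import product
import numpy as np
from sklearn.svm import SVC

def spectrum_features(seq, k=3, alphabet="ACGT"):
    """Count every k-mer in seq: the spectrum feature map."""
    index = {"".join(p): i for i, p in enumerate(product(alphabet, repeat=k))}
    v = np.zeros(len(index))
    for i in range(len(seq) - k + 1):
        v[index[seq[i:i + k]]] += 1
    return v

pos = ["TGTGATCTAGATCACA", "TGTGACGTAGGTCACT"]   # putative binding sites
neg = ["GGGCCCGGGCCCGGGC", "ATATATATATATATAT"]   # background sequences
X = np.array([spectrum_features(s) for s in pos + neg])
y = np.array([1, 1, 0, 0])

clf = SVC(kernel="linear").fit(X, y)  # linear kernel on counts = spectrum kernel
print(clf.predict([spectrum_features("TGTGATCGAGATCACA")]))
```
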

Relevance:

30.00%

Publisher:

Abstract:

The knowledge economy relies on the diffusion and use of knowledge as well as its creation (Houghton and Sheehan, 2000). The future success of economic activity will depend on the capacity of organisations to transform by increasing their flexibility; in particular, this transformation depends on a decentralised, networked and multi-skilled workforce. To help organisations transition, new strategies and structures for education are required. Education systems need to concentrate less on specialist skills and more on developing adaptable people with broad-based problem-solving skills and the social and interpersonal communication skills necessary for networking and communication. This paper presents the findings of a ‘Knowledge Economy Market Development Mapping Study’ conducted to identify the value of design education programs from primary through to tertiary level in Queensland, Australia, and explores the relationship of these programs to the development of the capacities mentioned above. The study includes the collection of qualitative and quantitative data through a literature review, focus groups and a survey. Recommendations for the future development of design education programs in Queensland, Australia are proposed, and future research opportunities are presented and discussed.

Relevance:

30.00%

Publisher:

Abstract:

Teachers of construction economics and estimating have long recognised that there is more to construction pricing than detailed calculation of costs (to the contractor). We always get to the point where we have to say "of course, experience or familiarity with the market is very important and this needs judgement, intuition, etc". Quite how important this is in construction pricing is not known, and we tend to trivialise its effect. If judgement of the market has a minimal effect, little harm would be done, but if it is really important then some quite serious consequences arise which go well beyond the teaching environment. Major areas of concern for the quantity surveyor are cost modelling and cost planning, neither of which pays any significant attention to the market effect. There are currently two schools of thought about the market effect issue. The first school is prepared to ignore possible effects until more is known; this may be called the pragmatic school. The second school exists solely to criticise the first school; we will call this the antagonistic school. Neither the pragmatic nor the antagonistic school seems particularly keen to resolve the issue one way or the other. The founder and leader of the antagonistic school is Brian Fine, whose paper in 1974 is still the basic text on the subject, and in which he coined the term 'socially acceptable' price to describe what we now recognise as the market effect. Mr Fine's argument was then, and has been since, that the uncertainty surrounding the contractors' costing and cost estimating process is such that it logically leads to a market-orientated pricing approach. Very little factual evidence, however, seems to be available to support these arguments in any conclusive manner. A further, and more important, point for the pragmatic school is that, even if the market effect is as important as Mr Fine believes, there are no indications of how it can be measured, evaluated or predicted. Since 1974 evidence has been accumulating which tends to reinforce the antagonists' view. A review of the literature covering both contractors' and designers' estimates found many references to the use of value judgements in construction pricing (Ashworth & Skitmore, 1985), which supports the antagonistic view in implying the existence of uncertainty overload. The most convincing evidence emerged quite by accident in some research we recently completed with practising quantity surveyors on estimating accuracy (Skitmore, 1985). In addition to demonstrating that individual quantity surveyors and certain types of buildings had a significant effect on estimating accuracy, one surprise result was that only a very small amount of information was used by the most expert surveyors to produce relatively accurate estimates. Only the type and size of the building, it seemed, was really relevant in determining accuracy. More detailed information about the building's specification, and even a sight of the drawings, did not significantly improve their accuracy level. This seemed to offer clear evidence that the constructional aspects of the project were largely irrelevant and that the expert surveyors were somehow tuning in to the market price of the building. The obvious next step is to feed our expert surveyors with more relevant 'market' information in order to assess its effect.
The problem with this is that our experts do not seem able to verbalise their requirements in this respect, a common occurrence in research of this nature. The lack of research into the nature of market effects on prices also means the literature provides little of benefit; hence the need for this study. It was felt that a clearer picture of the nature of construction markets would be obtained in an environment where free enterprise was a truly ideological force. For this reason, the United States of America was chosen for the next stage of our investigations. Several people were interviewed in an informal and unstructured manner to elicit their views on the action of market forces on construction prices. Although a small number of people were involved, they were thought to be reasonably representative of knowledge in construction pricing, and they were very well able to articulate their views. Our initial reaction to the interviews was that our USA subjects held views very close to those held in the UK. However, detailed analysis revealed the existence of remarkably clear and consistent insights that would not have been obtained in the UK. Further evidence was also obtained from literature relating to the subject, and some of the interviewees very kindly expanded on their views in later postal correspondence. We have now analysed all the evidence received and, although a great deal is of an anecdotal nature, we feel that our findings enable at least the basic nature of the subject to be understood, and that the factors and their interrelationships can now be examined more formally in relation to construction price levels. I must express my gratitude to the Royal Institution of Chartered Surveyors' Educational Trust and the University of Salford's Department of Civil Engineering for collectively funding this study. My sincere thanks also go to our American participants, who freely gave their time and valuable knowledge to us in our enquiries. Finally, I must record my thanks to Tim and Anne for their remarkable ability to produce an intelligible typescript from my unintelligible writing.

Relevance:

30.00%

Publisher:

Abstract:

Whether to keep products segregated (e.g., unbundled) or to integrate some or all of them (e.g., bundled) has been a problem of profound interest in areas such as portfolio theory in finance, risk capital allocations in insurance, and the marketing of consumer products. Such decisions are inherently complex and depend on factors such as the underlying product values and consumer preferences, the latter frequently described using value functions, also known as utility functions in economics. In this paper, we develop decision rules for multiple products, which we generally call ‘exposure units’ to naturally cover manifold scenarios spanning well beyond ‘products’. Our findings show, for example, that Thaler's celebrated principles of mental accounting hold as originally postulated when the values of all exposure units are positive (i.e. all are gains) or all negative (i.e. all are losses). In the case of exposure units with mixed-sign values, the decision rules are much more complex and rely on cataloguing a number of cases that grows very fast (the Bell number) with the number of exposure units. Consequently, in the present paper we provide detailed rules for the integration and segregation decisions in the case of up to three exposure units, and partial rules for an arbitrary number of units.
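
As a toy illustration of the all-gains/all-losses result, the check below uses the standard Kahneman-Tversky value function with the usual parameter estimates (a = b = 0.88, loss aversion 2.25); these are common textbook assumptions, not the paper's general setting.

```python
# Thaler's principles under an assumed prospect-theory value function:
# segregate gains, integrate losses; mixed signs need case-by-case analysis.
def v(x, a=0.88, b=0.88, lam=2.25):
    """Kahneman-Tversky value function (parameters are standard estimates)."""
    return x ** a if x >= 0 else -lam * (-x) ** b

def prefer_segregation(x, y):
    """True if valuing x and y separately beats valuing their sum."""
    return v(x) + v(y) > v(x + y)

print(prefer_segregation(50, 50))    # two gains: True (segregate)
print(prefer_segregation(-50, -50))  # two losses: False (integrate)
print(prefer_segregation(100, -20))  # mixed signs: False here, case-dependent
```
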

Relevance:

30.00%

Publisher:

Abstract:

Early warning based on real-time prediction of rain-induced instability of natural residual slopes helps to minimise human casualties due to such slope failures. Slope instability prediction is complicated, as it is influenced by many factors, including soil properties, soil behaviour, slope geometry, and the location and size of deep cracks in the slope. These deep cracks can facilitate rainwater infiltration into the deep soil layers and reduce the unsaturated shear strength of residual soil. Subsequently, a slip surface can form, triggering a landslide even in partially saturated soil slopes. Although past research has shown the effects of surface cracks on slope stability, research examining the influence of deep cracks is very limited. This study aimed to develop methodologies for predicting the real-time rain-induced instability of natural residual soil slopes with deep cracks. The results can be used to warn against potential rain-induced slope failures. The literature review conducted on rain-induced slope instability of unsaturated residual soil associated with soil cracks reveals that only limited studies have been done in the following areas related to this topic:
- Methods for detecting deep cracks in residual soil slopes.
- Practical application of unsaturated soil theory in slope stability analysis.
- Mechanistic methods for real-time prediction of rain-induced residual soil slope instability in critical slopes with deep cracks.
Two natural residual soil slopes at Jombok Village, Ngantang City, Indonesia, which are located near a residential area, were investigated to obtain the parameters required for the stability analysis of the slope. A survey first identified all related field geometrical information, including slope, roads, rivers, buildings, and boundaries of the slope. Second, the electrical resistivity tomography (ERT) method was used on the slope to identify the location and geometrical characteristics of deep cracks; the two ERT array models employed in this research were dipole-dipole and azimuthal. Next, bore-hole tests were conducted at different locations on the slope to identify soil layers and to collect undisturbed soil samples for laboratory measurement of the soil parameters required for the stability analysis. At the same bore-hole locations, Standard Penetration Tests (SPTs) were undertaken. Undisturbed soil samples taken from the bore-holes were tested in the laboratory to determine the variation of the following soil properties with depth:
- Classification and physical properties such as grain size distribution, Atterberg limits, water content, dry density and specific gravity.
- Saturated and unsaturated shear strength properties, using direct shear apparatus.
- Soil water characteristic curves (SWCC), using the filter paper method.
- Saturated hydraulic conductivity.
The following three methods were used to detect and simulate the location and orientation of cracks in the investigated slope:
(1) The electrical resistivity distribution of the sub-soil obtained from ERT.
(2) The profile of classification and physical properties of the soil, based on laboratory testing of soil samples collected from bore-holes, together with visual observations of the cracks on the slope surface.
(3) The stress distribution obtained from 2D dynamic analysis of the slope using QUAKE/W software, together with the laboratory-measured soil parameters and earthquake records of the area.
It was assumed that the deep crack in the slope under investigation was generated by earthquakes. A good agreement was obtained when comparing the location and orientation of the cracks detected by Method-1 and Method-2. However, the cracks simulated by Method-3 were not in good agreement with the output of Method-1 and Method-2; this may have been due to the material properties used and the assumptions made for the analysis. From Method-1 and Method-2, it can be concluded that the ERT method can be used to detect the location and orientation of a crack in a soil slope when the ERT is conducted in very dry or very wet soil conditions. In this study, the cracks detected by the ERT were used for the stability analysis of the slope. The stability of the slope was determined using the factor of safety (FOS) of a critical slip surface obtained by SLOPE/W using the limit equilibrium method. Pore-water pressure values for the stability analysis were obtained from a coupled transient seepage analysis of the slope using the finite-element-based software SEEP/W. A parametric study on the stability of the investigated slope revealed that the existence of deep cracks and their location in the soil slope are critical for its stability. The following two steps are proposed to predict the rain-induced instability of a residual soil slope with cracks:
(a) Step-1: The transient stability analysis of the slope is conducted from the date of the investigation (initial conditions are based on the investigation) to the current date, using measured rainfall data. The stability analyses are then continued for the next 12 months using predicted annual rainfall based on the previous five years' rainfall data for the area.
(b) Step-2: The stability of the slope is calculated in real time using measured rainfall. In this calculation, rainfall is predicted for the next hour or 24 hours, and the stability of the slope is calculated one hour or 24 hours in advance using real-time rainfall data.
If the Step-1 analysis shows critical stability for the forthcoming year, it is recommended that Step-2 be used for more accurate warning against future failure of the slope. In this research, the application of Step-1 to an investigated slope (Slope-1) showed that its stability was not approaching a critical value during 2012 (until 31st December 2012); therefore, the application of Step-2 was not necessary for that year. A case study (Slope-2) was used to verify the applicability of the complete proposed predictive method. A landslide event at Slope-2 occurred on 31st October 2010. Transient seepage and stability analyses of the slope, using data obtained from field tests (bore-holes, SPT, ERT) and laboratory tests, were conducted on 12th June 2010 following Step-1, and found the slope to be in a critical condition on that date. This shows that the application of Step-2 could have predicted this failure with sufficient warning time.
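
The study's stability analyses use SLOPE/W and SEEP/W; as a much simpler stand-in, the classical infinite-slope limit-equilibrium formula below shows the mechanism the abstract describes, with rising pore-water pressure from rain infiltration driving the factor of safety down. All soil parameters are assumed for demonstration.

```python
# Infinite-slope factor of safety with pore-water pressure u on the slip
# surface. A didactic stand-in, not the SLOPE/W + SEEP/W analysis.
import math

def factor_of_safety(c_eff, phi_deg, gamma, depth, beta_deg, u):
    """c' (kPa), phi' (deg), unit weight gamma (kN/m^3), slip depth (m),
    slope angle beta (deg), pore-water pressure u (kPa)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    normal = gamma * depth * math.cos(beta) ** 2           # normal stress
    shear = gamma * depth * math.sin(beta) * math.cos(beta)  # shear stress
    return (c_eff + (normal - u) * math.tan(phi)) / shear

for u in (0.0, 10.0, 20.0):  # pore pressure rising as rain infiltrates
    fos = factor_of_safety(5.0, 30.0, 18.0, 2.0, 35.0, u)
    print(f"u = {u:4.1f} kPa -> FOS = {fos:.2f}")  # FOS drops below 1.0
```
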

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to explore the role of leadership in problem-oriented policing (POP). Design/methodology/approach – This paper uses interrupted time series models to isolate the impact on crime trends of a transformational leader's efforts to spearhead the implementation of a program of POP, called the problem solving model (PSM), in a southern state in Australia. Findings – This paper finds that the PSM led directly to an impact on overall crime, with a significant reduction in crimes per 100,000 persons per year after the introduction of the PSM. The majority of the overall crime drop attributable to the implementation of POP was driven by reductions in property crime. The leadership influence of the PSM was not, however, effective in reducing all types of crime: crimes against the person were not affected by the introduction of the PSM, and public nuisance crimes largely followed the forecasted upward trajectory. Practical implications – The driver behind the PSM was Commissioner Hyde, and the success of the PSM is largely attributable to his strong commitment to transformational leadership and a top-down approach to implementation. These qualities encapsulate the original ideas that Goldstein (1979, 2003) highlighted as critical for the success of future POP programs. Social implications – Reducing crime is an important part of creating safe communities and improving quality of life for all citizens. This research shows that successful implementation of the PSM within South Australia under the strong leadership of Commissioner Hyde was a major factor in reducing property crime and overall crime rates. Originality/value – This paper is valuable because it demonstrates the link between strong leadership in policing, the commissioner's vision for POP, and how that vision translated into widespread adoption of POP. The study empirically shows that the statewide adoption of POP led to significant reductions in crime, particularly property crime.
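
A minimal sketch of the interrupted time series design named above, assuming statsmodels and a synthetic monthly crime series; the segmented-regression terms (pre-trend, level change, and slope change at the intervention) mirror the standard ITS specification, not the paper's actual models or data.

```python
# Interrupted time series via segmented regression on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, intervention = 120, 60                 # 120 months; PSM introduced at t=60
t = np.arange(n)
post = (t >= intervention).astype(float)  # level-change indicator
time_since = np.where(post == 1, t - intervention, 0)  # slope-change term

# Synthetic crimes per 100,000: upward trend, then a drop after the PSM
y = 500 + 1.0 * t - 80 * post - 0.5 * time_since + rng.normal(0, 10, n)

X = sm.add_constant(np.column_stack([t, post, time_since]))
fit = sm.OLS(y, X).fit()
print(fit.params)  # intercept, pre-trend, level change, post-slope change
```
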

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the initial phase of a Professional Learning Program (PLP) undertaken by 100 primary school teachers in China, which aimed to develop adaptive expertise in using technology to support the innovative science teaching and learning envisaged by the Chinese Ministry of Education's (2010-2020) education reforms. Key principles derived from the literature on professional learning and the scaffolding of learning informed the design of the PLP. Analysis of the data revealed that the participants had made substantial progress towards the development of adaptive expertise. This was manifested not only in advances in the participants' repertoires of Subject Matter Knowledge and Pedagogical Content Knowledge but also in changes to their levels of confidence and identities as teachers. By the end of the initial phase of the PLP, the participants had coalesced into a professional learning community that readily engaged in the sharing, peer review, reuse and adaptation, and collaborative design of innovative science learning and assessment activities. The findings indicate that those engaged in the development of PLPs for teachers in China need to take cognizance of certain cultural factors and traditions idiosyncratic to the Chinese educational system. A set of revised principles is then presented to inform the future design and implementation of PLPs for teachers in China.

Relevance:

30.00%

Publisher:

Abstract:

Novelty-biased cumulative gain (α-NDCG) has become the de facto measure within the information retrieval (IR) community for evaluating retrieval systems in the context of sub-topic retrieval. Setting an incorrect value of the parameter α, however, prevents the measure from behaving as desired in particular circumstances. In fact, when α is set according to common practice (i.e. α = 0.5), the measure can favour systems that return redundant relevant sub-topics over systems that provide novel relevant ones. Recognising this characteristic of the measure is important because it affects the comparison and ranking of retrieval systems. We propose an approach to overcome this problem by defining a safe threshold for the value of α on a per-query basis, and we study its impact on system rankings through a comprehensive simulation.
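
A toy computation makes the α = 0.5 problem concrete: a later document that redundantly repeats several already-covered sub-topics can out-score a document contributing one novel sub-topic, whereas a larger α reverses the preference. The gain and discount follow the standard α-DCG definition; the sub-topic judgements are invented, and no normalisation by the ideal ranking is applied.

```python
# Simplified alpha-DCG: a document's gain for sub-topic s is discounted by
# (1 - alpha)^(number of earlier relevant documents covering s).
import math

def alpha_dcg(ranking, alpha=0.5):
    """ranking: list of sets; each set holds the sub-topics a document covers."""
    seen = {}      # sub-topic -> number of earlier documents covering it
    score = 0.0
    for rank, subtopics in enumerate(ranking, start=1):
        gain = sum((1 - alpha) ** seen.get(s, 0) for s in subtopics)
        score += gain / math.log2(rank + 1)
        for s in subtopics:
            seen[s] = seen.get(s, 0) + 1
    return score

first = {"a", "b", "c"}
redundant = [first, {"a", "b", "c"}]   # second doc repeats all three sub-topics
novel = [first, {"d"}]                 # second doc adds one new sub-topic
print(alpha_dcg(redundant), alpha_dcg(novel))  # redundant wins at alpha = 0.5
print(alpha_dcg(redundant, 0.8) < alpha_dcg(novel, 0.8))  # True: novelty wins
```
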

Relevance:

30.00%

Publisher:

Abstract:

Parents whose children are identified as having experienced, or being at risk of experiencing, significant harm can potentially provide an invaluable dimension to our understanding of the circumstances that result in child abuse or neglect, and of how best to respond to these invariably complex situations. This paper reports findings from a study of the experiences of six parents. In-depth interviews were conducted with four mothers and two fathers who had been referred to an intensive family support service by the Queensland statutory child protection authority. Using a critical ecological perspective, the study focused on identifying and understanding the parents' experiences of using formal family support services, including aspects of service delivery that were helpful or unhelpful. Parents also commented on their experiences of statutory child protection services. Service components and worker qualities that parents identified as helpful included being accessible, targeted and integrated, and being able to meet a continuum of needs, from the micro level to broader levels. Their reports provide invaluable insight into how formal family support services, including child protection services, can better meet the needs of parents in addressing the recurring problem of child maltreatment.

Relevance:

30.00%

Publisher:

Abstract:

This research falls in the area of enhancing the quality of tag-based item recommendation systems. It aims to achieve this by employing a multi-dimensional user profile approach and by analysing the semantic aspects of tags. Tag-based recommender systems have two characteristics that need to be carefully studied in order to build a reliable system. Firstly, the multi-dimensional correlation, called the tag assignment, should be appropriately modelled in order to create the user profiles [1]. Secondly, the semantics behind the tags should be considered properly, as the flexibility of their design can cause semantic problems such as synonymy and polysemy [2]. This research proposes to address these two challenges for building a tag-based item recommendation system by employing tensor modelling as the multi-dimensional user profile approach, and the topic model as the semantic analysis approach. The first objective is to optimise the tensor model reconstruction and to improve the model's performance in generating quality recommendations. A novel Tensor-based Recommendation using Probabilistic Ranking (TRPR) method [3] has been developed. Results show this method to be scalable for large datasets and to outperform the benchmarking methods in terms of accuracy. The memory-efficient loop implements the n-mode block-striped (matrix) product for tensor reconstruction as an approximation of the initial tensor. The probabilistic ranking calculates the probability of users selecting candidate items using their tag preference list, based on the entries generated from the reconstructed tensor. The second objective is to analyse the tag semantics and utilise the outcome in building the tensor model. This research proposes to investigate the problem using a topic model approach, to preserve the nature of tags as a “social vocabulary” [4]. For the tag assignment data, topics can be generated from the occurrences of tags given for an item. However, only a limited number of tags is available to represent an item as a collection of topics, since an item might have been tagged with only a few tags. Consequently, the generated topics might not be able to represent the items appropriately. Furthermore, given that each tag can belong to any topic with various probability scores, the occurrence of tags cannot simply be mapped onto topics to build the tensor model. A standard weighting technique will not appropriately calculate the value of the tagging activity, since it will define the context of an item using a tag instead of a topic.
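
By way of illustration only, the sketch below builds a toy user x item x tag tensor and reconstructs an approximation via truncated HOSVD, i.e. a chain of n-mode products. The dimensions, ranks and entries are invented, and HOSVD here is a generic stand-in for tensor reconstruction, not the TRPR method itself.

```python
# Toy third-order tensor reconstruction via truncated HOSVD (n-mode products).
import numpy as np

def nmode_product(T, M, mode):
    """Multiply tensor T by matrix M along the given mode."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

# 0/1 tag-assignment tensor: 4 users x 5 items x 3 tags (invented data)
T = np.zeros((4, 5, 3))
T[0, 1, 2] = T[0, 2, 2] = T[1, 1, 0] = T[2, 3, 1] = T[3, 4, 0] = 1.0

# Truncated HOSVD: leading left singular vectors of each mode's unfolding
ranks = (2, 2, 2)
factors = []
for mode, r in enumerate(ranks):
    unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
    U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
    factors.append(U[:, :r])

core = T
for mode, U in enumerate(factors):      # project onto the factor spaces
    core = nmode_product(core, U.T, mode)
approx = core
for mode, U in enumerate(factors):      # map back: the reconstructed tensor
    approx = nmode_product(approx, U, mode)

# Entries of `approx` give graded user-item-tag scores that can be ranked
# to produce recommendations for each user.
print(approx[0].round(2))
```
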

Relevance:

30.00%

Publisher:

Abstract:

Accurate process model elicitation continues to be a time-consuming task, requiring skill on the part of the interviewer to extract explicit and tacit process information from the interviewee. Many errors occur in this elicitation stage that would be avoided by better activity recall, more consistent specification methods and greater engagement in the elicitation process by interviewees. Metasonic GmbH has developed a process elicitation tool for their process suite. As part of a research engagement with Metasonic, staff from QUT, Australia, have developed a 3D virtual world approach to the same problem, viz. eliciting process models from stakeholders in an intuitive manner. This book chapter tells the story of how QUT staff developed a 3D virtual world tool for process elicitation, how they took the outcomes of their research project to Metasonic for evaluation, and, finally, how Metasonic responded to the initial proof of concept.

Relevance:

30.00%

Publisher:

Abstract:

An increasing range of technology services are now offered on a self-service basis. However, problems with self-service technologies (SSTs) occur at times, due to technical error, staff error, or consumers' own mistakes. Considering the role of consumers as co-producers in the SST context, we aim to study consumers' behaviours, strategies, and decision making in solving their problems with SSTs, and to identify the factors contributing to their persistence in solving such problems. This study contributes to information systems research, as it is the first study that aims to identify this process and the factors affecting consumers' persistence in solving their problems with SSTs. A focus group with user support staff has been conducted, yielding initial results that informed the subsequent phases of the study. Next, using the Critical Incident Technique, data will be gathered through focus groups with users, a diary method, and a think-aloud method.