958 results for Observational techniques and algorithms


Relevance:

100.00%

Publisher:

Abstract:

With the explosive growth of resources available through the Internet, information mismatching and overload have become a severe concern to users. Web users are commonly overwhelmed by the huge volume of information and are faced with the challenge of finding the most relevant and reliable information in a timely manner. Personalised information gathering and recommender systems represent state-of-the-art tools for efficient selection of the most relevant and reliable information resources, and interest in such systems has increased dramatically over the last few years. However, web personalisation has not yet been fully exploited, and difficulties remain, from both technological and social perspectives, in selecting resources through recommender systems. Aiming to promote high-quality research to overcome these challenges, this paper provides a comprehensive survey of recent work and achievements in the areas of personalised web information gathering and recommender systems. The survey covers concept-based techniques exploited in personalised information gathering and recommender systems.
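
The survey is descriptive and includes no implementation, but a minimal sketch of the kind of concept-based matching it refers to could look like the following; the concept vocabulary, weights and resources are illustrative assumptions, not material from the paper.

```python
# Minimal sketch of concept-based matching between a user profile and candidate
# resources, as a cosine similarity over concept weights. The concepts, weights
# and resources below are illustrative assumptions, not data from the survey.
from math import sqrt

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity between two sparse concept-weight vectors."""
    shared = set(u) & set(v)
    dot = sum(u[c] * v[c] for c in shared)
    norm = sqrt(sum(w * w for w in u.values())) * sqrt(sum(w * w for w in v.values()))
    return dot / norm if norm else 0.0

user_profile = {"machine learning": 0.9, "privacy": 0.4, "travel": 0.1}
resources = {
    "paper_a": {"machine learning": 0.8, "recommender systems": 0.7},
    "paper_b": {"travel": 0.9, "photography": 0.5},
}

# Rank resources by how well their concepts match the user's profile.
ranked = sorted(resources, key=lambda r: cosine(user_profile, resources[r]), reverse=True)
print(ranked)
```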

Relevance:

100.00%

Publisher:

Abstract:

This research is a step forward in improving the accuracy of detecting anomalies in a data graph representing connectivity between people in an online social network. The proposed hybrid methods are based on fuzzy machine learning techniques utilising different types of structural input features. The methods are presented within a multi-layered framework which provides the full set of requirements needed for finding anomalies in data graphs generated from online social networks, including data modelling and analysis, labelling, and evaluation.
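
The abstract does not detail the structural features or the fuzzy classifier, so the following is only a hedged sketch of the general idea: extract simple structural features (degree, local clustering) from a toy social graph and flag star-like nodes; a plain z-score rule stands in for the fuzzy machine learning component, and the graph itself is an assumption.

```python
# Sketch of extracting structural features from a social graph and flagging
# anomalous nodes. The thesis uses fuzzy machine learning; here a simple
# z-score rule stands in for the classifier, and the toy graph is an assumption.
from statistics import mean, pstdev

graph = {  # undirected adjacency (toy example)
    "a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"},
    "d": {"a", "e", "f", "g", "h"}, "e": {"d"}, "f": {"d"}, "g": {"d"}, "h": {"d"},
}

def clustering(node: str) -> float:
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = list(graph[node])
    if len(nbrs) < 2:
        return 0.0
    links = sum(1 for i, u in enumerate(nbrs) for v in nbrs[i + 1:] if v in graph[u])
    return 2.0 * links / (len(nbrs) * (len(nbrs) - 1))

features = {n: (len(graph[n]), clustering(n)) for n in graph}

# Flag nodes whose degree deviates strongly from the mean while their
# neighbourhood stays sparsely interconnected (star-like structure).
degrees = [d for d, _ in features.values()]
mu, sigma = mean(degrees), pstdev(degrees) or 1.0
anomalies = [n for n, (d, c) in features.items() if (d - mu) / sigma > 1.0 and c < 0.2]
print(features)
print("flagged:", anomalies)
```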

Relevance:

100.00%

Publisher:

Abstract:

Peggy Shaw's RUFF (USA, 2013) and Queensland Theatre Company's collaboration with Queensland University of Technology, Total Dik! (Australia, 2013), overtly and evocatively draw on an aestheticized use of the cinematic techniques and technologies of chroma key to reveal the tensions in their production and add layers to their performances. In doing so they offer invaluable insight into where filmic and theatrical approaches overlap. This paper draws on Eckersall, Grehan and Scheer's New Media Dramaturgy (2014) to reposition the frame as a contribution to intermedial theatre and performance practices in light of increasing convergence between seemingly disparate discourses. In RUFF, the scenic environment replicates a chroma-key 'studio' which facilitates the reconstruction of memory displaced after a stroke. RUFF uses the screen and projections to recall crooners, lounge singers, movie stars, rock and roll bands, and an eclectic line of eccentric family members living inside Shaw. While the show pays tribute to those who have kept her company across decades of theatrical performance, the use of a non-composited chroma-key technique as a theatrical device, and the work's taciturn revelation of the production process during performance, play a central role in its exploration of the juxtaposition between its reconstructed form and content. In contrast, Total Dik! uses real-time green-screen compositing during performance as a scenic device. Actors manipulate scale models, refocus cameras and generate scenes within scenes in the construction of the work's examination of an isolated Dictator. The 'studio' is again replicated as a site for (re)construction, only in this case Total Dik! actively seeks to reveal the process of production as the performance plays out. Building on RUFF, and on other works such as By the Way, Meet Vera Stark (2012) and Hotel Modern's God's Beard (2012), this work blends mobile technologies, models and green-screen capture to explore aspects of transmedia storytelling in a theatrical environment (Jenkins, 2009, 2013). When a green screen is placed on stage, it reads at once as metaphor and as a challenge to the language of theatre. It becomes, or rather acts as, a 'sign' that alludes to the nature of the reconstructed, recomposited, manipulated and controlled. In RUFF and in Total Dik! it is also a place where, as a mode of production and subsequent reveal, it adds weight to performance. These works are informed by Auslander (1999) and Giesekam (2007) and speak to and echo Lehmann's Postdramatic Theatre (2006). This paper's consideration of the integration of studio technique and live performance as a dynamic approach to multi-layered theatrical production develops our understanding of their combined use in a live performance environment.

Relevance:

100.00%

Publisher:

Abstract:

Eleanor Smith [pseudonym], teacher: I was talking to the kids about MacDonalds – I forget exactly what the context was – I said "ah, the Americans call them French fries, and, you know, MacDonalds is an American chain and they call them French fries because the Americans call them French fries", and this little Australian kid in the front row, very Australian child, said to me, "I call them French fries!" . . . Um, a fourth grade boy whom I taught in 1993 at this school, the world basketball championships were on . . . Americans were playing their dream machine and the Boomers were up against them . . . and, ah, this boy was very interested in basketball . . . but it's not in my blood, not in the way cricket is for example . . . Um, Um, and I said to this fellow, "um, well", I said, "Australia's up against Dream Machine tomorrow". He [Jason, pseudonym] said, "Ah, you know, Boomers probably won't win" . . . I said, "Well that's sport, mate". I said, "You never know in sport. Australia might win". And he looked at me and he said, "I'm not going for Australia, I'm going for America". This is from an Australian boy! And I thought so strong is the hype, so strong is the, is the, power of the media, etc., that this boy is not [pause], I can't tell you how outraged I was. Here's me as an Australian and I don't even support basketball, it's not even my sport, um, but that he would respond like that because of the power of the American machine that's converting kids' minds, the way they think, where they're putting their loyalties, etc. I was just appalled, but that's where he was. And when I asked kids for their favourite place, he said Los Angeles.

Relevance:

100.00%

Publisher:

Abstract:

Navua sedge, a member of the Cyperaceae family, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomon Islands, and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with an annual rainfall exceeding 2500 mm, Navua sedge is capable of forming dense stands that smother many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr, and MSMA proving the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity, and off-site movement may occur when some herbicides are used at the rates predicted to give LC90 control. A seasonality trial using halosulfuron (97.5 g ai/ha) gave better Navua sedge control when sprayed from March to September (84%) than at other times of the year (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application of glyphosate (36%), although loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) applied with a rope wick, by ground foliar spraying or with a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoeing (3%), slashing (-13%) and crushing (-30%)) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed from being added to the soil seed bank. Treatments that result in seed burial, for example discing, are likely to prolong seed persistence and should be avoided. The sprouting of vegetative propagules and root fragmentation also need to be considered when selecting control options.

Relevance:

100.00%

Publisher:

Abstract:

Sampling design is critical to the quality of quantitative research, yet it does not always receive appropriate attention in nursing research. The current article details how balancing probability techniques with practical considerations produced a representative sample of Australian nursing homes (NHs). Budgetary, logistical, and statistical constraints were managed by excluding some NHs (e.g., those too difficult to access) from the sampling frame; a stratified, random sampling methodology yielded a final sample of 53 NHs from a population of 2,774. In testing the adequacy of representation of the study population, chi-square tests for goodness of fit generated nonsignificant results for distribution by distance from major city and type of organization. A significant result for state/territory was expected and was easily corrected for by the application of weights. The current article provides recommendations for conducting high-quality, probability-based samples and stresses the importance of testing the representativeness of achieved samples.
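
As an illustration of the representativeness check described above, the sketch below runs a chi-square goodness-of-fit test comparing an achieved sample's distribution by organisation type against population proportions; the counts and proportions are hypothetical, not the study's data.

```python
# Sketch of a representativeness check: a chi-square goodness-of-fit test
# comparing the achieved sample's distribution by organisation type with the
# population proportions. All counts and proportions are hypothetical.
from scipy.stats import chisquare

population_props = {"not-for-profit": 0.57, "private": 0.36, "government": 0.07}
sample_counts = {"not-for-profit": 32, "private": 17, "government": 4}  # n = 53

n = sum(sample_counts.values())
observed = [sample_counts[k] for k in population_props]
expected = [population_props[k] * n for k in population_props]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {stat:.2f}, p = {p:.3f}")  # a non-significant p suggests adequate representation
```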

Relevance:

100.00%

Publisher:

Abstract:

The analysis of sequential data is required in many diverse areas such as telecommunications, stock market analysis, and bioinformatics. A basic problem related to the analysis of sequential data is the sequence segmentation problem. A sequence segmentation is a partition of the sequence into a number of non-overlapping segments that cover all data points, such that each segment is as homogeneous as possible. This problem can be solved optimally using a standard dynamic programming algorithm. In the first part of the thesis, we present a new approximation algorithm for the sequence segmentation problem. This algorithm has a smaller running time than the optimal dynamic programming algorithm, while having a bounded approximation ratio. The basic idea is to divide the input sequence into subsequences, solve the problem optimally in each subsequence, and then appropriately combine the solutions to the subproblems into one final solution. In the second part of the thesis, we study alternative segmentation models that are devised to better fit the data. More specifically, we focus on clustered segmentations and segmentations with rearrangements. While in the standard segmentation of a multidimensional sequence all dimensions share the same segment boundaries, in a clustered segmentation the multidimensional sequence is segmented in such a way that dimensions are allowed to form clusters. Each cluster of dimensions is then segmented separately. We formally define the problem of clustered segmentations and we experimentally show that segmenting sequences using this segmentation model leads to solutions with smaller error for the same model cost. Segmentation with rearrangements is a novel variation of the segmentation problem: in addition to partitioning the sequence, we also seek to apply a limited amount of reordering, so that the overall representation error is minimized. We formulate the problem of segmentation with rearrangements and we show that it is NP-hard to solve or even to approximate. We devise effective algorithms for the proposed problem, combining ideas from dynamic programming and from outlier detection algorithms for sequences. In the final part of the thesis, we discuss the problem of aggregating the results of segmentation algorithms on the same set of data points. In this case, we are interested in producing a partitioning of the data that agrees as much as possible with the input partitions. We show that this problem can be solved optimally in polynomial time using dynamic programming. Furthermore, we show that not all data points are candidates for segment boundaries in the optimal solution.
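
The abstract notes that the basic segmentation problem is solved optimally by a standard dynamic programming algorithm. The sketch below implements that standard O(n^2 k) dynamic program for a one-dimensional sequence, assuming the common sum-of-squared-errors homogeneity cost (the abstract does not fix a particular cost function); it is an illustration, not the thesis's own code.

```python
# Standard dynamic-programming segmentation: partition a 1-D sequence into k
# contiguous segments minimising total within-segment squared error. O(n^2 k).
def segment(seq, k):
    n = len(seq)
    prefix = [0.0] * (n + 1)          # prefix sums of values
    prefix_sq = [0.0] * (n + 1)       # prefix sums of squared values
    for i, x in enumerate(seq):
        prefix[i + 1] = prefix[i] + x
        prefix_sq[i + 1] = prefix_sq[i] + x * x

    def sse(i, j):
        """Squared error of representing seq[i:j] by its mean."""
        s, sq, m = prefix[j] - prefix[i], prefix_sq[j] - prefix_sq[i], j - i
        return sq - s * s / m

    INF = float("inf")
    # cost[p][j]: best error for seq[:j] using p segments; cut[p][j]: last boundary
    cost = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    cost[0][0] = 0.0
    for p in range(1, k + 1):
        for j in range(p, n + 1):
            for i in range(p - 1, j):
                c = cost[p - 1][i] + sse(i, j)
                if c < cost[p][j]:
                    cost[p][j], cut[p][j] = c, i

    # Recover segment boundaries by backtracking.
    bounds, j = [], n
    for p in range(k, 0, -1):
        bounds.append(j)
        j = cut[p][j]
    return cost[k][n], sorted(bounds)

print(segment([1, 1, 1, 9, 9, 9, 4, 4], 3))   # -> (0.0, [3, 6, 8])
```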

Relevance:

100.00%

Publisher:

Abstract:

Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra. In data mining one needs results that are interpretable -- and what is considered interpretable in data mining can be very different from what is considered interpretable in linear algebra. The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability -- factor matrices are of the same type as the original matrix -- and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Several other decomposition methods are also described, and the computational complexity of computing them is studied together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
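
For readers unfamiliar with the Boolean setting mentioned above, the sketch below shows the Boolean matrix product and the resulting reconstruction error for a binary decomposition; the small matrices are illustrative only and the thesis's own algorithms are not reproduced here.

```python
# Sketch of the Boolean matrix product underlying binary decompositions:
# (B o C)[i][j] = OR_k (B[i][k] AND C[k][j]). The matrices are illustrative.
import numpy as np

def boolean_product(B: np.ndarray, C: np.ndarray) -> np.ndarray:
    """Boolean matrix multiplication of two 0/1 matrices."""
    return (B @ C > 0).astype(int)   # count the ANDs, then threshold to get the OR

def reconstruction_error(A: np.ndarray, B: np.ndarray, C: np.ndarray) -> int:
    """Number of cells where the Boolean product B o C disagrees with A."""
    return int(np.sum(A != boolean_product(B, C)))

A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
B = np.array([[1, 0],
              [1, 1],
              [0, 1]])
C = np.array([[1, 1, 0],
              [0, 1, 1]])

print(boolean_product(B, C))
print("error:", reconstruction_error(A, B, C))   # 0: B o C reconstructs A exactly
```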

Relevance:

100.00%

Publisher:

Abstract:

Diffuse large B-cell lymphoma (DLBCL) is the most common of the non-Hodgkin lymphomas. As DLBCL is characterized by heterogeneous clinical and biological features, its prognosis varies. To date, the International Prognostic Index has been the strongest predictor of outcome for DLBCL patients; however, it takes no biological characteristics of the disease into account. Gene expression profiling studies have identified two major cell-of-origin phenotypes in DLBCL with different prognoses, the favourable germinal centre B-cell-like (GCB) and the unfavourable activated B-cell-like (ABC) phenotypes. However, results on the prognostic impact of the immunohistochemically defined GCB versus non-GCB distinction are controversial. Furthermore, since the addition of the anti-CD20 antibody rituximab to chemotherapy has been established as the standard treatment of DLBCL, all molecular markers need to be evaluated in the post-rituximab era. In this study, we aimed to evaluate the predictive value of the immunohistochemically defined cell-of-origin classification in DLBCL patients. The GCB and non-GCB phenotypes were defined according to the Hans algorithm (CD10, BCL6 and MUM1/IRF4) among 90 immunochemotherapy- and 104 chemotherapy-treated DLBCL patients. In the chemotherapy group, we observed a significant difference in survival between GCB and non-GCB patients, with a good and a poor prognosis, respectively. However, in the rituximab group, no prognostic value of the GCB phenotype was observed. Likewise, among 29 high-risk de novo DLBCL patients receiving high-dose chemotherapy and autologous stem cell transplantation, the survival of non-GCB patients was improved, but no difference in outcome was seen between the GCB and non-GCB subgroups. Since the results suggested that the Hans algorithm was not applicable to immunochemotherapy-treated DLBCL patients, we aimed to focus further on algorithms based on ABC markers. We examined a modified activated B-cell-like algorithm based on MUM1/IRF4 and FOXP1, as well as the previously reported Muris algorithm (BCL2, CD10 and MUM1/IRF4), among 88 DLBCL patients uniformly treated with immunochemotherapy. Both algorithms distinguished the unfavourable ABC-like subgroup, which had a significantly inferior failure-free survival relative to the GCB-like DLBCL patients. Similarly, the results for the individual predictive molecular markers, the transcription factor FOXP1 and the anti-apoptotic protein BCL2, have been inconsistent and should be assessed in immunochemotherapy-treated DLBCL patients. These markers were evaluated in a cohort of 117 patients treated with rituximab and chemotherapy. FOXP1 expression could not distinguish between patients with favourable and those with poor outcomes. In contrast, BCL2-negative DLBCL patients had significantly superior survival relative to BCL2-positive patients. Our results indicate that the immunohistochemically defined cell-of-origin classification in DLBCL has a prognostic impact in the immunochemotherapy era when the identifying algorithms are based on ABC-associated markers. We also propose that BCL2 negativity is predictive of a favourable outcome. Further investigational efforts are, however, warranted to identify the molecular features of DLBCL that could enable individualized cancer therapy in routine patient care.
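
For illustration, the sketch below encodes the commonly cited form of the Hans decision rule (CD10, BCL6, MUM1/IRF4 with a 30% positivity cut-off) referred to in the abstract; it is a reading of the published rule, not the study's own classification code, and the example values are invented.

```python
# A minimal sketch of the Hans decision rule (CD10, BCL6, MUM1/IRF4) using the
# commonly cited 30% immunohistochemical positivity cut-off. Illustrative only;
# check the primary reference (Hans et al.) before any reuse.
CUTOFF = 0.30  # fraction of tumour cells staining positive

def hans_phenotype(cd10: float, bcl6: float, mum1: float) -> str:
    """Classify a DLBCL case as 'GCB' or 'non-GCB' from marker positivity fractions."""
    if cd10 >= CUTOFF:
        return "GCB"
    if bcl6 < CUTOFF:
        return "non-GCB"
    # CD10-negative, BCL6-positive cases are split by MUM1/IRF4.
    return "non-GCB" if mum1 >= CUTOFF else "GCB"

print(hans_phenotype(cd10=0.6, bcl6=0.1, mum1=0.0))   # GCB
print(hans_phenotype(cd10=0.0, bcl6=0.5, mum1=0.7))   # non-GCB
```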

Relevance:

100.00%

Publisher:

Abstract:

A spanning tree T of a graph G is said to be a tree t-spanner if the distance between any two vertices in T is at most t times their distance in G. A graph that has a tree t-spanner is called a tree t-spanner admissible graph. The problem of deciding whether a graph is tree t-spanner admissible is NP-complete for any fixed t >= 4 and is linearly solvable for t <= 2. The case t = 3 still remains open. A chordal graph is called a 2-sep chordal graph if all of its minimal a-b vertex separators, for every pair of non-adjacent vertices a and b, are of size two. It is known that not all 2-sep chordal graphs admit tree 3-spanners. This paper presents a structural characterization and a linear time recognition algorithm for tree 3-spanner admissible 2-sep chordal graphs. Finally, a linear time algorithm to construct a tree 3-spanner of a tree 3-spanner admissible 2-sep chordal graph is proposed. (C) 2010 Elsevier B.V. All rights reserved.
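
The definition in the first sentence translates directly into a check; the sketch below verifies whether a given spanning tree is a tree t-spanner by comparing BFS distances in T and in G. The example graphs are illustrative, and the paper's linear-time construction is not reproduced here.

```python
# Sketch of the tree t-spanner condition: distances in a spanning tree T must
# never exceed t times the corresponding distances in G. Graphs are given as
# adjacency dictionaries; the example is illustrative.
from collections import deque
from itertools import combinations

def bfs_dist(adj, src):
    """Hop distances from src to every vertex of an unweighted graph."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def is_tree_t_spanner(G, T, t):
    dG = {v: bfs_dist(G, v) for v in G}
    dT = {v: bfs_dist(T, v) for v in T}
    return all(dT[u][v] <= t * dG[u][v] for u, v in combinations(G, 2))

# Example: G is a 4-cycle, T is the spanning path 3-2-1-4.
G = {1: {2, 4}, 2: {1, 3}, 3: {2, 4}, 4: {1, 3}}
T = {1: {2, 4}, 2: {1, 3}, 3: {2}, 4: {1}}
print(is_tree_t_spanner(G, T, 3))   # True: the stretched pair (3, 4) has d_T = 3, d_G = 1
```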

Relevance:

100.00%

Publisher:

Abstract:

Lime-fly ash mixtures are exploited for the manufacture of fly ash bricks, which find applications in load-bearing masonry. Lime-pozzolana reactions take place at a slow pace under ambient temperature conditions, and hence very long curing durations are required to achieve meaningful strength values. The present investigation examines the improvements in strength development in lime-fly ash compacts through low temperature steam curing and the use of additives such as gypsum. Results of density-strength-moulding water content relationships, the influence of lime-fly ash ratio, steam curing and the role of gypsum in strength development, and the characteristics of compacted lime-fly ash-gypsum bricks are discussed. The test results reveal that (a) strength increases with increase in density irrespective of lime content, type of curing and moulding water content, (b) the optimum lime-fly ash ratio yielding maximum strength is about 0.75 under normal curing conditions, (c) 24 h of steam curing (at 80°C) is sufficient to achieve nearly the maximum possible strength, (d) the optimum gypsum content yielding maximum compressive strength is 2%, and (e) with the gypsum additive it is possible to obtain lime-fly ash bricks or blocks having sufficient strength (> 10 MPa) after 28 days of normal wet burlap curing.

Relevance:

100.00%

Publisher:

Abstract:

Wireless sensor networks can often be viewed in terms of a uniform deployment of a large number of nodes in a region of Euclidean space. Following deployment, the nodes self-organize into a mesh topology with a key aspect being self-localization. Having obtained a mesh topology in a dense, homogeneous deployment, a frequently used approximation is to take the hop distance between nodes to be proportional to the Euclidean distance between them. In this work, we analyze this approximation through two complementary analyses. We assume that the mesh topology is a random geometric graph on the nodes; and that some nodes are designated as anchors with known locations. First, we obtain high probability bounds on the Euclidean distances of all nodes that are h hops away from a fixed anchor node. In the second analysis, we provide a heuristic argument that leads to a direct approximation for the density function of the Euclidean distance between two nodes that are separated by a hop distance h. This approximation is shown, through simulation, to very closely match the true density function. Localization algorithms that draw upon the preceding analyses are then proposed and shown to perform better than some of the well-known algorithms present in the literature. Belief-propagation-based message-passing is then used to further enhance the performance of the proposed localization algorithms. To our knowledge, this is the first usage of message-passing for hop-count-based self-localization.
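
A small simulation in the spirit of the analysis above: build a random geometric graph, compute hop counts from an anchor by BFS, and compare hop distance with Euclidean distance. The node count, region and radio range are arbitrary illustrative choices, not parameters from the paper.

```python
# Sketch of the approximation being analysed: in a dense random geometric
# graph, compare hop distance from an anchor with Euclidean distance.
import random
from collections import deque
from math import dist

random.seed(1)
N, RANGE = 500, 0.08                       # nodes in the unit square, radio range
pts = [(random.random(), random.random()) for _ in range(N)]

# Build the random geometric graph: connect nodes closer than RANGE.
adj = [[] for _ in range(N)]
for i in range(N):
    for j in range(i + 1, N):
        if dist(pts[i], pts[j]) <= RANGE:
            adj[i].append(j)
            adj[j].append(i)

# BFS hop counts from an anchor node.
anchor, hops = 0, [None] * N
hops[anchor] = 0
q = deque([anchor])
while q:
    u = q.popleft()
    for v in adj[u]:
        if hops[v] is None:
            hops[v] = hops[u] + 1
            q.append(v)

# Mean Euclidean distance to the anchor, grouped by hop count.
for h in range(1, 6):
    d = [dist(pts[anchor], pts[i]) for i in range(N) if hops[i] == h]
    if d:
        print(f"h = {h}: mean distance {sum(d) / len(d):.3f}  (h * RANGE = {h * RANGE:.3f})")
```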

Relevance:

100.00%

Publisher:

Abstract:

Nanoindentation techniques are used to measure the nanoindentation hardness, elastic modulus, fracture toughness, film thickness, and bending deformation of microstructures in the surface layers of various thin films and bulk materials, while nanoscratch techniques are used to measure the roughness, critical adhesion force, friction coefficient, and scratch cross-section profiles of various thin films and bulk materials. The nanoindenter is an advanced instrument for characterising the mechanical properties of material surface layers at scales from micrometres down to tens of nanometres, and can be widely applied to quality inspection in surface engineering.
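
The abstract lists the quantities a nanoindenter measures but not the analysis behind them; hardness and elastic modulus are commonly extracted from the load-displacement curve with the Oliver-Pharr relations, sketched below. The indenter constants and curve values are standard textbook assumptions used purely for illustration, not data from this work.

```python
# Sketch of how hardness and elastic modulus are commonly extracted from a
# nanoindentation load-displacement curve (Oliver-Pharr analysis). The abstract
# does not specify the instrument's analysis method; the constants and curve
# values below are standard textbook assumptions for illustration.
from math import pi, sqrt

def oliver_pharr(p_max, h_max, stiffness, nu_sample=0.25,
                 e_indenter=1141e9, nu_indenter=0.07, epsilon=0.75, beta=1.05):
    """Return (hardness Pa, sample modulus Pa) for an ideal Berkovich indenter."""
    # Contact depth and projected contact area (ideal Berkovich geometry).
    h_c = h_max - epsilon * p_max / stiffness
    area = 24.5 * h_c ** 2
    hardness = p_max / area
    # Reduced modulus from the unloading stiffness, then solve for the sample.
    e_reduced = sqrt(pi) * stiffness / (2.0 * beta * sqrt(area))
    inv_sample = 1.0 / e_reduced - (1.0 - nu_indenter ** 2) / e_indenter
    e_sample = (1.0 - nu_sample ** 2) / inv_sample
    return hardness, e_sample

# Illustrative curve parameters: 10 mN peak load, 300 nm depth, 250 kN/m stiffness.
H, E = oliver_pharr(p_max=10e-3, h_max=300e-9, stiffness=250e3)
print(f"H = {H / 1e9:.1f} GPa, E = {E / 1e9:.0f} GPa")
```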