178 results for Set-valued map
Abstract:
The Valley Mountain 15’ quadrangle straddles the Pinto Mountain Fault, which bounds the eastern Transverse Ranges in the south against the Mojave Desert province in the north. The Pinto Mountains, part of the eastern Transverse Ranges in the south part of the quadrangle, expose a series of Paleoproterozoic gneisses and granite and the Proterozoic quartzite of Pinto Mountain. Early Triassic quartz monzonite intruded the gneisses and was ductilely deformed prior to voluminous Jurassic intrusion of diorite, granodiorite, quartz monzonite, and granite plutons. The Jurassic rocks include part of the Bullion Mountains Intrusive Suite, which crops out prominently at Valley Mountain and in the Bullion Mountains, as well as in the Pinto Mountains. Jurassic plutons in the southwest part of the quadrangle are deeply denuded from midcrustal emplacement levels, in contrast to supracrustal Jurassic limestone and volcanic rocks exposed in the northeast. Dikes inferred to be part of the Jurassic Independence Dike Swarm intrude the Jurassic plutons and Proterozoic rocks. Late Cretaceous intrusion of the Cadiz Valley Batholith in the northeast caused contact metamorphism of adjacent Jurassic plutonic rocks...
Abstract:
The Sessional Academic Success (SAS) project is a sustainable, distributed model for supporting sessional staff at QUT. Developed by the Learning and Teaching Unit, SAS complements our Sessional Academic Program (SAP): a sequence of formal academic development workshops explained in a complementary nomination. SAS recognises that while these programs are very well received and a crucial aspect of preparing and advancing sessional teachers, they are necessarily encapsulated in the moment of their delivery and are generic, as they address all faculties (with their varied cultures, processes and pedagogies). The SAS project extends this formal, centrally offered activity into local, ‘just in time’, ongoing support within schools. It takes a distributed leadership approach. Experienced sessional academics are recruited and employed as Sessional Academic Success Advisors (SASAs). They provide sessional staff in their schools with contextually specific, needs-based, peer-to-peer development opportunities; one-on-one advice on classroom management and strategies for success; and help to troubleshoot challenges. The SASAs are trained by the Learning and Teaching Unit co-ordinator, and ongoing support is provided centrally and by school-based co-ordinators. This team approach situates the SASAs at the centre of an organisation map (see diagram of support relationships below).
The SAS project aims to support sessional staff in their professional development by:
• Offering contextual, needs-based support at school level by harnessing local expertise;
• Providing further development opportunities that are local and focal.
SAS aims to retain sessional staff by:
• Responding to self-nominated requests for support with ‘just in time’, safe and reliable advice in times of need;
• Building sessional staff confidence through help from a trusted peer in dealing with challenges;
• Building a supportive academic community for sessional staff, which helps them feel part of faculty life and of a community of teaching practice.
SAS aims to support sessional staff in the development of academic teaching careers by:
• Recognising the capacity of experienced sessional staff to support their peers in ways that are unique, valuable and valued, and providing the agency to do so;
• Providing career advancement and leadership opportunities for sessional staff.
SAS takes unique approaches within each school, using strategies such as:
• Welcomes and school orientation by SASAs;
• Regular check-ins; face-to-face advice and online support;
• Compiling local resources to complement university-wide resources;
• Sessional-to-sessional ‘just in time’ training (e.g. assessment and marking when marking commences);
• Peer feedback and mentoring (the opportunity to sit in on more experienced sessionals’ classes);
• Sessional staff awards (nominated by students);
• Communities of practice to discuss topics and issues with a view to (and support for) publishing on learning and teaching.
In these ways, SASAs complement support offered by unit coordinators, administrators, and the Learning and Teaching Unit. Pairing senior and ‘understudy’ advisors ensures a line of succession, sustainability and continuity. A pilot program commenced in 2012 involving three schools (Psychology and Social Work; Electrical Engineering and Computer Science; Media, Entertainment and Creative Arts).
It will be expanded across schools in 2013.
Abstract:
Textual document sets have become an important and rapidly growing information source on the web, and text classification is one of the crucial technologies for information organisation and management. Text classification has become more and more important and has attracted wide attention from researchers in different research fields. In this paper, many feature selection methods and the implementation algorithms and applications of text classification are first introduced. However, because there is much noise in the knowledge extracted by current data-mining techniques for text classification, much uncertainty is produced in the classification process, arising from both knowledge extraction and knowledge usage; more innovative techniques and methods are therefore needed to improve the performance of text classification. Further improving the process of knowledge extraction and the effective utilization of the extracted knowledge remains a critical and challenging step. A Rough Set decision-making approach is proposed that uses Rough Set decision techniques to classify more precisely those textual documents that are difficult to separate with classic text classification methods. The purpose of this paper is to give an overview of existing text classification technologies; to demonstrate Rough Set concepts and the decision-making approach based on Rough Set theory for building a more reliable and effective text classification framework with higher precision; to set up an innovative evaluation metric, named CEI, which is very effective for performance assessment in similar research; and to propose a promising research direction for addressing the challenging problems in text classification, text mining and other related fields.
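The Rough Set decision techniques described above rest on lower and upper set approximations. A minimal sketch of those two operators (illustrative only; the documents, labels and partition are hypothetical, and this is not the paper's CEI framework):

```python
# Rough Set building blocks: approximate a target set from equivalence
# classes of indiscernible objects (e.g. documents with identical
# feature values).

def approximations(partition, target):
    """partition: list of equivalence classes (lists of object ids);
    target: the set of objects to approximate."""
    target = set(target)
    lower, upper = set(), set()
    for eq_class in partition:
        eq = set(eq_class)
        if eq <= target:      # class certainly inside the target
            lower |= eq
        if eq & target:       # class possibly inside the target
            upper |= eq
    return lower, upper

# Documents 1..6 partitioned by indiscernible feature values;
# documents 1-3 carry a given topic label.
classes = [[1, 2], [3, 4], [5, 6]]
low, up = approximations(classes, [1, 2, 3])
print(sorted(low))  # [1, 2]        — certainly in the topic
print(sorted(up))   # [1, 2, 3, 4]  — possibly in the topic
```

Documents in the boundary region (upper minus lower) are exactly the hard-to-separate cases the abstract targets.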
Abstract:
Stigmergy is a biological term used when discussing a subset of insect swarm behaviour, describing the apparent organisation seen during their activities. Stigmergy describes a communication mechanism based on environment-mediated signals which trigger responses among the insects. This phenomenon is demonstrated in the behaviour of ants and their food-gathering process when following pheromone trails, where the pheromones are a form of environment-mediated communication. What is interesting about this phenomenon is that highly organised societies are achieved without an apparent management structure. Stigmergy is also observed in human environments, both natural and engineered. It is implicit in the Web, where sites provide a virtual environment supporting coordinative contributions. Researchers in varying disciplines appreciate the power of this phenomenon and have studied how to exploit it. As stigmergy becomes more widely researched, we see its definition mutate as papers citing the original work become referenced themselves. Each paper interprets these works in ways very specific to the research being conducted. Our own research aims to better understand what improves the collaborative function of a Web site when exploiting the phenomenon. However, when researching stigmergy to develop our understanding, we discovered the lack of a standardised, abstract model for the phenomenon. Papers frequently cite the same generic descriptions before becoming intimately focused on formal specifications of an algorithm, or on esoteric discussions regarding sub-facets of the topic. None provide a holistic, macro-level view to model and standardise the nomenclature. This paper provides a content analysis of influential literature documenting the numerous theoretical and experimental papers that have focused on stigmergy. We establish that stigmergy is a phenomenon that transcends the insect world and is more than just a metaphor when applied to the human world.
We present, from our own research, our general theory and abstract model of the semantics of stigma in stigmergy. We hope our model will clarify the nuances of the phenomenon into a useful road map, and standardise the vocabulary that we witness becoming confused and divergent. Furthermore, this paper documents the analysis on which we base our next paper: Special Theory of Stigmergy: A Design Pattern for Web 2.0 Collaboration.
Abstract:
MapReduce is a computation model for processing large data sets in parallel on large clusters of machines in a reliable, fault-tolerant manner. A MapReduce computation is broken down into a number of map tasks and reduce tasks, which are performed by so-called mappers and reducers, respectively. The placement of the mappers and reducers on the machines directly affects the performance and cost of the MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. Thus, in this paper we propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it by comparing it with several other heuristics on solution quality and computation time, solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics and can obtain a better solution in a reasonable time. Furthermore, we verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement that places a fixed number of mappers/reducers on each machine. The comparison results show that the computation using our mapper/reducer placement is much cheaper than the computation using the conventional placement, while still satisfying the computation deadline.
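The abstract does not detail its heuristic, so as a hedged illustration of the bin-packing flavour of the placement problem, here is a classic first-fit-decreasing baseline; the task costs and machine capacity are made-up numbers, not from the paper:

```python
# First-fit decreasing: place the largest mapper/reducer tasks first,
# opening a new machine only when no existing machine has room.

def first_fit_decreasing(task_costs, machine_capacity):
    """Returns a list of per-machine task-cost lists."""
    machines, loads = [], []
    for cost in sorted(task_costs, reverse=True):
        for i, load in enumerate(loads):
            if load + cost <= machine_capacity:
                machines[i].append(cost)
                loads[i] += cost
                break
        else:                          # no machine fits: open a new one
            machines.append([cost])
            loads.append(cost)
    return machines

placement = first_fit_decreasing([7, 5, 4, 3, 2, 2], machine_capacity=10)
print(placement)  # [[7, 3], [5, 4], [2, 2]]
```

FFD packs six tasks onto three machines here; the paper's own heuristic would additionally weigh cloud cost and the computation deadline.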
Abstract:
The objective of this paper is to explore the relationship between dynamic capabilities and different types of online innovations. Building on qualitative data from the publishing industry, our analysis revealed that companies that had relatively strong dynamic capabilities in all three areas (sensing, seizing and reconfiguration) seem to produce innovations that combine their existing capabilities on either the market or the technology dimension with new capabilities on the other dimension thus resulting in niche creation and revolutionary type innovations. Correspondingly, companies with a weaker or more one-sided set of dynamic capabilities seem to produce more radical innovations requiring both new market and technological capabilities. The study therefore provides an empirical contribution to the emerging work on dynamic capabilities through its in-depth investigation of the capabilities of the four case firms, and by mapping the patterns between the firm's portfolio of dynamic capabilities and innovation outcomes.
Abstract:
Computational models represent a highly suitable framework, not only for testing biological hypotheses and generating new ones but also for optimising experimental strategies. As one surveys the literature devoted to cancer modelling, it is obvious that immense progress has been made in applying simulation techniques to the study of cancer biology, although the full impact has yet to be realised. For example, there are excellent models to describe cancer incidence rates or factors for early disease detection, but these predictions are unable to explain the functional and molecular changes that are associated with tumour progression. In addition, it is crucial that interactions between mechanical effects, and intracellular and intercellular signalling are incorporated in order to understand cancer growth, its interaction with the extracellular microenvironment and invasion of secondary sites. There is a compelling need to tailor new, physiologically relevant in silico models that are specialised for particular types of cancer, such as ovarian cancer owing to its unique route of metastasis, which are capable of investigating anti-cancer therapies, and generating both qualitative and quantitative predictions. This Commentary will focus on how computational simulation approaches can advance our understanding of ovarian cancer progression and treatment, in particular, with the help of multicellular cancer spheroids, and thus can inform biological hypotheses and experimental design.
Abstract:
In this chapter we seek to interrogate the methods and assumptions underpinning geocriticism by engaging with and reframing dominant ways of analysing mediated representations of Australian space in cultural narratives, specifically film, literature, and theatre. What, we ask, might geocriticism contribute to the analysis of Australian texts in which location figures prominently? We argue a geocritical approach may provide an interdisciplinary framework that offers a way of identifying tropes across geographic regions and across media representations. Drawing on scholarship spanning Australian cinematic, literary and theatrical narratives, this chapter surveys published work in the field and posits that a refined geocritical mapping and analysis of the cultural terrain foregrounds the significance of geography to culture and draws different traditions of spatial enquiry into dialogue without privileging any particular textual form. We conclude by scoping possibilities for future research emerging from recent technological developments in interactive online cartography.
Abstract:
In Thomas Mann’s tetralogy of the 1930s and 1940s, Joseph and His Brothers, the narrator declares history is not only “that which has happened and that which goes on happening in time,” but it is also “the stratified record upon which we set our feet, the ground beneath us.” By opening up history to its spatial, geographical, and geological dimensions Mann both predicts and encapsulates the twentieth-century’s “spatial turn,” a critical shift that divested geography of its largely passive role as history’s “stage” and brought to the fore intersections between the humanities and the earth sciences. In this paper, I draw out the relationships between history, narrative, geography, and geology revealed by this spatial turn and the questions these pose for thinking about the disciplinary relationship between geography and the humanities. As Mann’s statement exemplifies, the spatial turn itself has often been captured most strikingly in fiction, and I would argue nowhere more so than in Graham Swift’s Waterland (1983) and Anne Michaels’s Fugitive Pieces (1996), both of which present space, place, and landscape as having a palpable influence on history and memory. The geographical/geological line that runs through both Waterland and Fugitive Pieces continues through Tim Robinson’s non-fictional, two-volume “topographical” history Stones of Aran. Robinson’s Stones of Aran—which is not history, not geography, and not literature, and yet is all three—constructs an imaginative geography that renders inseparable geography, geology, history, memory, and the act of writing.
Abstract:
The last fifty years have witnessed the growing pervasiveness of the figure of the map in critical, theoretical, and fictional discourse. References to mapping and cartography are endemic in poststructuralist theory, and, similarly, geographically and culturally diverse authors of twentieth-century fiction seem fixated upon mapping. While the map metaphor has been employed for centuries to highlight issues of textual representation and epistemology, the map metaphor itself has undergone a transformation in the postmodern era. This metamorphosis draws together poststructuralist conceptualizations of epistemology, textuality, cartography, and metaphor, and signals a shift away from modernist preoccupations with temporality and objectivity to a postmodern pragmatics of spatiality and subjectivity. Cartographic Strategies of Postmodernity charts this metamorphosis of cartographic metaphor, and argues that the ongoing reworking of the map metaphor renders it a formative and performative metaphor of postmodernity.
Abstract:
Numeric set watermarking is a way to provide ownership proof for numerical data. Numerical data can be considered primitives for multimedia types such as images and videos, since these are organized forms of numeric information. Thus, the capability to watermark numerical data directly implies the capability to watermark multimedia objects and discourage information theft on social networking sites and the Internet in general. Unfortunately, very limited research has been done in the field of numeric set watermarking, due to underlying limitations in the number of items in the set and the LSBs in each item available for watermarking. In 2009, Gupta et al. proposed a numeric set watermarking model that embeds watermark bits in the items of the set based on a hash value of the items’ most significant bits (MSBs). If an item is chosen for watermarking, a watermark bit is embedded in the least significant bits (LSBs), and the replaced bit is inserted in the fractional value to provide reversibility. The authors show their scheme to be resilient against the traditional subset addition, deletion, and modification attacks as well as secondary watermarking attacks. In this paper, we present a bucket attack on this watermarking model. The attack consists of creating buckets of items with the same MSBs and determining whether the items of each bucket carry watermark bits. Experimental results show that the bucket attack is very strong and destroys the entire watermark with a success rate close to 100%. We examine the inherent weaknesses in the watermarking model of Gupta et al. that leave it vulnerable to the bucket attack and propose potential safeguards that can provide resilience against this attack.
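A hedged sketch of the bucket-grouping idea (the 8-bit item size, the MSB/LSB split, and the bit-flip perturbation are hypothetical choices for illustration, not Gupta et al.'s exact parameters): because selection depends only on a hash of the MSBs, items sharing MSBs receive the same selection decision, so perturbing each bucket's LSBs wholesale destroys any embedded bits.

```python
# Group items by their MSBs, then flip the low bit of every item in a
# bucket; any watermark bit carried in those LSBs is destroyed while
# the MSBs (and hence the items' approximate values) are preserved.
from collections import defaultdict

MSB_BITS = 4          # assumed split of an 8-bit item

def bucket_attack(items):
    buckets = defaultdict(list)
    for item in items:
        buckets[item >> (8 - MSB_BITS)].append(item)  # key = MSBs
    return {key: [v ^ 0b1 for v in vals]              # perturb LSBs
            for key, vals in buckets.items()}

attacked = bucket_attack([0b10110010, 0b10110111, 0b01000001])
print(sorted(attacked))  # [4, 11] — two MSB buckets
```

In the real attack the buckets are used first to detect which items carry watermark bits before modifying them.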
Abstract:
A new wave energy flow (WEF) map concept was proposed in this work. Based on it, an improved technique incorporating the laser scanning method and Betti’s reciprocal theorem was developed to evaluate the shape and size of damage as well as to realize visualization of wave propagation. In this technique, a simple signal processing algorithm was proposed to construct the WEF map when waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors were employed to improve inspection reliability. Various damages in aluminum and carbon fiber reinforced plastic laminated plates were experimentally and numerically evaluated to validate this technique. The results show that it can effectively evaluate the shape and size of damage from wave field variations around the damage in the WEF map.
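One plausible reading of the WEF-map construction (an assumption on our part; the paper's actual signal-processing algorithm may differ) is that each laser scan point is assigned the time-integrated squared amplitude of its measured trace, so that damage appears as local variations in the energy field:

```python
# Build a toy wave-energy-flow map: each scan point's value is the
# time-integrated squared amplitude of the wavefield measured there.

def wef_map(signals, dt):
    """signals: dict mapping a scan point (x, y) to its sampled time
    trace; returns point -> energy (sum of squares times dt)."""
    return {pt: dt * sum(v * v for v in trace)
            for pt, trace in signals.items()}

# Two scan points with synthetic traces; the point the wave actually
# passed through carries visibly more energy.
traces = {(0, 0): [0.0, 1.0, -1.0, 0.0],
          (0, 1): [0.0, 0.1, -0.1, 0.0]}
energy = wef_map(traces, dt=1e-6)
print(energy[(0, 0)] > energy[(0, 1)])  # True
```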
Abstract:
This study considers the role and nature of co-thought gestures when students process map-based mathematics tasks. These gestures are typically spontaneously produced silent gestures which do not accompany speech and are represented by small movements of the hands or arms, often directed toward an artefact. The study analysed 43 students (aged 10–12 years) over a 3-year period as they solved map tasks that required spatial reasoning. The map tasks were representative of those typically found in mathematics classrooms for this age group and required route finding and coordinate knowledge. The results indicated that co-thought gestures were used to navigate the problem space and monitor movements within the spatial challenges of the respective map tasks. Gesturing was most influential when students encountered unfamiliar tasks or when they found the tasks spatially demanding. From a teaching and learning perspective, explicit co-thought gesturing highlights the cognitive challenges students are experiencing, since students tended not to use gesturing in tasks where the spatial demands were low.
Abstract:
Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt’04 to the distributed setting. By using a secret sharing scheme, we provide a distributed solution to the FNP private matching problem, called distributed private matching. In our distributed private matching scheme, we use a polynomial to represent one party’s dataset, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to the distributed set intersection and the cardinality of the intersection, and we further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the private matching and set intersection primitives of Freedman et al. Our distributed construction can be of great value when the dataset is outsourced and its privacy is the main concern. In such cases, our distributed solutions keep the utility of those set operations while the dataset privacy is not compromised. Compared with previous works, we achieve a more efficient solution in terms of computation. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
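A plaintext sketch of the FNP polynomial primitive the paper builds on (heavily hedged: the real protocol evaluates the polynomial under homomorphic encryption, and this paper secret-shares the coefficients across servers; the prime modulus below is an arbitrary illustrative choice): a set is encoded as the roots of a polynomial, so an element matches exactly when the polynomial evaluates to zero.

```python
# Encode a dataset as P(x) = prod (x - a) over a prime field, then
# test membership by evaluating P at a candidate element.

MOD = 2**61 - 1   # illustrative prime modulus for the coefficient field

def encode_set(dataset):
    """Return descending-order coefficients of P(x) = prod (x - a)."""
    coeffs = [1]
    for a in dataset:
        # Multiply the running polynomial by (x - a).
        coeffs = [(hi - a * lo) % MOD
                  for hi, lo in zip(coeffs + [0], [0] + coeffs)]
    return coeffs

def evaluate(coeffs, x):
    """Horner evaluation; zero exactly when x is an encoded root."""
    result = 0
    for c in coeffs:
        result = (result * x + c) % MOD
    return result

P = encode_set([3, 7, 11])
print(evaluate(P, 7) == 0)   # True  — 7 is in the set
print(evaluate(P, 5) == 0)   # False — 5 is not
```

In the distributed scheme the coefficient list would never exist in the clear at any single server; each server holds only a secret share of it.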