894 results for Standardization in robotics


Relevance: 30.00%

Abstract:

As the pressure on Diamond and the world's synchrotrons for higher throughput of diffraction experiments continues to grow, novel techniques are required for presenting micron-scale crystals to the X-ray beam. Currently this task is both labour-intensive and primarily a serial process. Diffraction measurements typically take milliseconds, but sample preparation and presentation can reduce throughput to as few as four measurements an hour. With beamline waiting times as long as two years, it is of key importance for researchers to capitalize on available beam time, generating as much data as possible. Other approaches detailed in the literature [1][2][3] are largely focused on using robotics to automate the steps of existing human protocols. The work detailed here is the development and discussion of a bottom-up approach relying on SSAW self-assembly, including material selection, microfluidic integration and tuning of the acoustic cavity to order the protein crystals.
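A note on scale (not from the abstract itself): in a standing acoustic wave, particles with positive acoustic contrast collect at the pressure nodes, which sit half a wavelength apart, so the pitch of the ordered crystal lines follows directly from the drive frequency and the speed of sound in the carrier fluid. The short Python sketch below illustrates this relationship; the frequencies and the water-like sound speed are placeholder values, not parameters reported in this work.

```python
# Minimal sketch: pressure-node pitch of a standing acoustic wave.
# The drive frequencies and water-like sound speed are illustrative
# placeholders, not values reported in this work.

def node_spacing_m(frequency_hz, sound_speed_m_s=1497.0):
    """Pressure nodes of a standing wave sit half a wavelength apart."""
    wavelength = sound_speed_m_s / frequency_hz
    return wavelength / 2.0

for f_mhz in (10.0, 20.0, 40.0):  # hypothetical drive frequencies
    pitch_um = node_spacing_m(f_mhz * 1e6) * 1e6
    print(f"{f_mhz:5.1f} MHz -> node pitch ~ {pitch_um:6.1f} um")
```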


Relevance: 30.00%

Abstract:

Few symbols of 1950s-1960s America remain as central to our contemporary conception of Cold War culture as the iconic ranch-style suburban home. While the house took center stage in the Nixon/Khrushchev kitchen debates as a symbol of modern efficiency and capitalist values, its popularity depended largely upon its obvious appropriation of vernacular architecture from the 19th century, those California haciendas and Texas dogtrots that dotted the American west. Contractors like William Levitt modernized the historical common houses, hermetically sealing their porous construction, all while using the ranch-style roots of the dwelling to galvanize a myth of an indigenous American culture. At a moment of intense occupational bureaucracy, political uncertainty and atomized social life, the rancher gave a self-identifying white consumer base reason to believe they could master their own plot in the expansive frontier. Only one example of America’s mid-century love affair with commodified vernacular forms, the ranch-style home represents a broad effort on the part of corporate and governmental interest groups to transform the vernacular into a style that expresses a distinctly homogeneous vision of American culture. “Other than a Citizen” begins with an anatomy of that transformation, and then turns to the work of four poets who sought to reclaim the vernacular from that process of standardization and use it to countermand the containment-era strategies of Cold War America.

In four chapters, I trace references to common speech and verbal expressivity in the poetry and poetic theory of Charles Olson, Robert Duncan, LeRoi Jones/Amiri Baraka and Gwendolyn Brooks, against the historical backdrop of the Free-Speech Movement and the rise of mass culture. When poets frame nonliterary speech within the literary page, they encounter the inability of writing to capture the vital ephemerality of verbal expression. Rather than treat this limitation as an impediment, the writers in my study use the poem to dramatize the fugitivity of speech, emphasizing it as a disruptive counterpoint to the technologies of capture. Where critics such as Houston Baker interpret the vernacular strictly in terms of resistance, I take a cue from the poets and argue that the vernacular, rooted etymologically at the intersection of domestic security and enslaved margin, represents a gestalt form, capable at once of establishing centralized power and sparking minor protest. My argument also expands upon Michael North’s exploration of the influence of minstrelsy and regionalism on the development of modernist literary technique in The Dialect of Modernism. Whereas North focuses on writers from the early 20th century, I account for the next generation, whose America was not a culturally inferior collection of immigrants but an imperial power, replete with economic, political and artistic dominance. Instead of settling for an essentially American idiom, the poets in my study saw in the vernacular not phonetic misspellings, slang terminology and fragmented syntax, but the potential to provoke and thereby frame a more ethical mode of social life, straining against the regimentation of citizenship.

My attention to the vernacular argues for an alignment among writers who have been segregated by the assumption that race and aesthetics are mutually exclusive categories. In reading these writers alongside one another, “Other than a Citizen” shows how the avant-garde concepts of projective poetics and composition by field develop out of an interest in black expressivity. Conversely, I trace black radicalism and its emphasis on sociality back to the communalism practiced at the experimental arts college in Black Mountain, North Carolina, where Olson and Duncan taught. In pressing for this connection, my work reveals the racial politics embedded within the speech-based aesthetics of the postwar era, while foregrounding the aesthetic dimension of militant protest.

Not unlike today, the popular rhetoric of the Cold War insists that to be a citizen involves defending one’s status as a rightful member of an exclusionary nation. To be other than a citizen, as the poets in my study make clear, begins with eschewing the false certainty that accompanies categorical nominalization. In promoting a model of mutually dependent participation, these poets lay the groundwork for an alternative model of civic belonging, where volition and reciprocity replace compliance and self-sufficiency. In reading their lines, we become all the more aware of the cracks that run the length of our load-bearing walls.

Relevance: 30.00%

Abstract:

This paper presents a solution to part of the problem of making robotic or semi-robotic digging equipment less dependent on human supervision. A method is described for identifying rocks of a certain size that may affect digging efficiency or require special handling. The process involves three main steps. First, by using range and intensity data from a time-of-flight (TOF) camera, a feature descriptor is used to rank points and separate regions surrounding high-scoring points. This allows a wide range of rocks to be recognized because features can represent a whole or just part of a rock. Second, these points are filtered to extract only points thought to belong to the large object. Finally, a check is carried out to verify that the resultant point cloud actually represents a rock. Results are presented from field testing on piles of fragmented rock. Note to Practitioners—This paper presents an algorithm to identify large boulders in a pile of broken rock as a step towards an autonomous mining dig planner. In mining, piles of broken rock can contain large fragments that may need to be specially handled. To assess rock piles for excavation, we make use of a TOF camera that does not rely on external lighting to generate a point cloud of the rock pile. We then segment large boulders from the pile's surface by using a novel feature descriptor and distinguish between real and false boulder candidates. Preliminary field experiments show promising results, with the algorithm performing nearly as well as human test subjects.
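The following Python sketch is only a rough illustration of the score, filter and verify structure described above; it substitutes a trivial height-above-neighbourhood score for the paper's novel feature descriptor, and all thresholds are invented placeholders rather than values from the field tests.

```python
# Illustrative sketch of a score -> filter -> verify pipeline on a point cloud.
# The "descriptor" here (height above the local neighbourhood mean) and all
# thresholds are placeholders; the paper uses its own novel feature descriptor.
import numpy as np
from scipy.spatial import cKDTree
from sklearn.cluster import DBSCAN

def find_boulder_candidates(points, radius=0.3, score_thresh=0.15,
                            min_extent=0.5):
    """points: (N, 3) array of x, y, z samples of a rock-pile surface."""
    tree = cKDTree(points[:, :2])

    # 1) Score every point: how far it sticks out above its neighbourhood.
    scores = np.empty(len(points))
    for i, p in enumerate(points):
        idx = tree.query_ball_point(p[:2], r=radius)
        scores[i] = p[2] - points[idx, 2].mean()

    # 2) Keep only high-scoring points and group them into candidate objects.
    keep = points[scores > score_thresh]
    if len(keep) == 0:
        return []
    labels = DBSCAN(eps=radius, min_samples=10).fit_predict(keep)

    # 3) Verify: accept clusters whose footprint is large enough to matter.
    boulders = []
    for lab in set(labels) - {-1}:
        cluster = keep[labels == lab]
        extent = cluster[:, :2].max(axis=0) - cluster[:, :2].min(axis=0)
        if extent.max() >= min_extent:
            boulders.append(cluster)
    return boulders
```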

Relevance: 30.00%

Abstract:

This thesis is a conceptual examination of the positions from which we teach in public education. As it is philosophical in nature, it presents no qualitative or quantitative data. It offers a review of selected relevant literature and an analysis of personal and professional experience, with the intent to pose critical questions about teaching and learning. The framework of this thesis rests on the following contentions: First, from its inception, public schooling served capital by preparing skilled labour for emerging industrial markets. This history is the hegemonic shadow that hangs over public education today. Second, movements toward the standardization of funding, curriculum, and evaluation support the further commodification of public schooling. The “accountability” that standardization offers, the “back to basics” that it aims for, is counter to the potential that public education might critically inform citizens and seek social justice. Third, movements toward the privatization of public schooling under the guise of “choice” and “mobility”, brought on by manufactured crisis, serve only to widen socio-economic inequities as capitalist neoliberal interests seek profit in both the product of public schools and in schooling itself. If we recognize and understand the power of public education to inform vast numbers of citizens who will, in turn, either maintain or reform society, we must ask: What do we want public education to be? What are the effects of continuing down historically conventional and increasingly standardized paths? What do progressive pedagogies offer? How might teachers destandardize their pedagogy and pursue equitable opportunities for marginalized students? How might students name themselves and their world, that they might play a part in its reimagining? For whom do we teach, and under what conditions? From where do we teach, and why? For educators to ask these questions, and to employ what they discover, will necessitate taking substantial risks. It will necessitate taking a stand and cannot be done alone. Teachers must seek out the collaboration of their students. They must offer students the time and the space to find their own voices, to create their own selves, and to envision previously uncharted paths on which we might walk together.

Relevance: 30.00%

Abstract:

The map representation of an environment should be selected based on its intended application. For example, a geometrically accurate map describing the Euclidean space of an environment is not necessarily the best choice if only a small subset of its features is required. One possible subset is the orientations of the flat surfaces in the environment, represented by a special parameterization of normal vectors called axes. Devoid of positional information, the entries of an axis map form a non-injective relationship with the flat surfaces in the environment, which results in physically distinct flat surfaces being represented by a single axis. This drastically reduces the complexity of the map, but retains important information about the environment that can be used in meaningful applications in both two and three dimensions. This thesis presents axis mapping, which is an algorithm that accurately and automatically estimates an axis map of an environment based on sensor measurements collected by a mobile platform. Furthermore, two major applications of axis maps are developed and implemented. First, the LiDAR compass is a heading estimation algorithm that compares measurements of axes with an axis map of the environment. Pairing the LiDAR compass with simple translation measurements forms the basis for an accurate two-dimensional localization algorithm. It is shown that this algorithm eliminates the growth of heading error in both indoor and outdoor environments, resulting in accurate localization over long distances. Second, in the context of geotechnical engineering, a three-dimensional axis map is called a stereonet, which is used as a tool to examine the strength and stability of a rock face. Axis mapping provides a novel approach to create accurate stereonets safely, rapidly, and inexpensively compared to established methods. The non-injective property of axis maps is leveraged to probabilistically describe the relationships between non-sequential measurements of the rock face. The automatic estimation of stereonets was tested in three separate outdoor environments. It is shown that axis mapping can accurately estimate stereonets while improving safety, requiring significantly less time and effort, and lowering costs compared to traditional and current state-of-the-art approaches.
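As a hedged illustration of the LiDAR-compass idea of comparing measured axes against an axis map, the Python sketch below folds 2D wall-normal measurements into axes on [0, 180) degrees and grid-searches for the heading correction that best aligns them with a small known map; the axis parameterization, the brute-force search and the example values are simplifications, not the thesis's algorithm.

```python
# Toy 2D "LiDAR compass": find the heading correction that best aligns
# measured wall-normal axes (angles folded into [0, pi) radians) with a
# known axis map. The brute-force search, the example map and the noise
# values are illustrative simplifications only.
import numpy as np

def wrap_axis(a):
    """Fold angles into [0, pi): an axis has no front/back distinction."""
    return np.mod(a, np.pi)

def heading_correction(measured_axes, map_axes, resolution=np.deg2rad(0.1)):
    """Grid-search the correction that minimises the summed distance from
    each corrected measurement to its nearest map axis."""
    best, best_cost = 0.0, np.inf
    for dh in np.arange(0.0, np.pi, resolution):
        corrected = wrap_axis(measured_axes + dh)[:, None]
        diff = np.abs(corrected - map_axes[None, :])
        diff = np.minimum(diff, np.pi - diff)       # axis distance wraps at pi
        cost = diff.min(axis=1).sum()
        if cost < best_cost:
            best, best_cost = dh, cost
    return best

# Example: an environment whose flat surfaces define axes at 0 and 50 degrees;
# the scanner reports them about 2 degrees low because of heading drift.
map_axes = np.deg2rad(np.array([0.0, 50.0]))
measured = wrap_axis(np.deg2rad(np.array([177.9, 48.2, 178.3])))
print(np.rad2deg(heading_correction(measured, map_axes)))  # roughly 2 degrees
```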

Relevance: 30.00%

Abstract:

In geotechnical engineering, the stability of rock excavations and walls is estimated by using tools that include a map of the orientations of exposed rock faces. However, measuring these orientations by using conventional methods can be time-consuming, sometimes dangerous, and is limited to regions of the exposed rock that are reachable by a human. This thesis introduces a 2D, simulated, quadcopter-based rock wall mapping algorithm for GPS-denied environments such as underground mines or near high walls on the surface. The proposed algorithm employs techniques from the field of robotics known as simultaneous localization and mapping (SLAM) and is a step towards 3D rock wall mapping. Not only are quadcopters agile, but they can also hover. This is very useful for confined spaces such as underground or near rock walls. The quadcopter requires sensors to enable self-localization and mapping in dark, confined and GPS-denied environments. However, these sensors are limited by the quadcopter's payload and power restrictions. Because of these restrictions, a lightweight 2D laser scanner is proposed. As a first step towards a 3D mapping algorithm, this thesis proposes a simplified scenario in which a simulated 1D laser range finder and 2D IMU are mounted on a quadcopter that is moving on a plane. Because the 1D laser does not provide enough information to map the 2D world from a single measurement, many measurements are combined over the trajectory of the quadcopter. Least Squares Optimization (LSO) is used to optimize the estimated trajectory and rock face for all data collected over the length of a flight. Simulation results show that the mapping algorithm developed is a good first step. It shows that by combining measurements over a trajectory, the scanned rock face can be estimated using a lower-dimensional range sensor. A swathing manoeuvre is introduced as a way to promote loop closures within a short time period, thus reducing accumulated error. Some suggestions on how to improve the algorithm are also provided.
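To make the least-squares idea concrete, here is a deliberately simplified Python sketch (not the thesis's formulation): a vehicle drifts towards a flat wall at an unknown position while logging noisy odometry increments and 1D laser ranges, and all measurements from the flight are combined in a single least-squares problem that jointly recovers the trajectory and the wall.

```python
# Toy least-squares problem: fuse noisy odometry increments and 1D laser
# ranges to a flat wall at unknown position x = w, over a whole flight.
# The flat-wall model and the noise levels are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
T, wall_x = 20, 10.0
true_x = np.concatenate([[0.0], np.cumsum(rng.uniform(0.2, 0.5, T - 1))])
odom = np.diff(true_x) + rng.normal(0, 0.05, T - 1)    # noisy step lengths
ranges = wall_x - true_x + rng.normal(0, 0.02, T)      # noisy laser ranges

def residuals(theta):
    # First pose is anchored at x = 0 to remove the free translation.
    x = np.concatenate([[0.0], theta[:T - 1]])
    w = theta[T - 1]
    r_odo = (np.diff(x) - odom) / 0.05     # motion (odometry) constraints
    r_rng = ((w - x) - ranges) / 0.02      # wall range constraints
    return np.concatenate([r_odo, r_rng])

theta0 = np.concatenate([np.cumsum(odom), [8.0]])      # dead-reckoned guess
sol = least_squares(residuals, theta0)
print("estimated wall position:", sol.x[-1])           # close to 10.0
```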

Relevance: 30.00%

Abstract:

Nanotechnology is a multidisciplinary science that is experiencing a boom today, providing new products with attractive physicochemical properties for many applications. In the agri/feed/food sector, nanotechnology offers great opportunities for obtaining innovative products and applications for agriculture and livestock, water treatment, and the production, processing, storage and packaging of food. To this end, a wide variety of nanomaterials, ranging from metals and inorganic metal oxides to organic nanomaterials carrying bioactive ingredients, are applied. This review provides an overview of current and future applications of nanotechnology in the food industry. Food additives and food-contact materials are currently the main applications, while future applications are expected in the fields of nano-encapsulation and nanocomposites, used in novel foods, additives, biocides, pesticides and food-contact materials.

Relevance: 30.00%

Abstract:

Learning Analytics is an emerging field focused on analyzing learners’ interactions with educational content. One of the key open issues in learning analytics is the standardization of the data collected. This is a particularly challenging issue in serious games, which generate a diverse range of data. This paper reviews the current state of learning analytics, data standards and serious games, studying how serious games track their players’ interactions and the metrics that can be distilled from them. Based on this review, we propose an interaction model that establishes a basis for applying Learning Analytics to serious games. This paper then analyzes the current standards and specifications used in the field. Finally, it presents an implementation of the model with one of the most promising specifications: Experience API (xAPI). The Experience API relies on Communities of Practice developing profiles that cover different use cases in specific domains. This paper presents the Serious Games xAPI Profile: a profile developed to align with the most common use cases in the serious games domain. The profile is applied to a case study (a demo game), which explores the technical practicalities of standardizing data acquisition in serious games. In summary, the paper presents a new interaction model to track serious games and its implementation with the xAPI specification.
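For readers unfamiliar with xAPI, the snippet below shows the general shape of a single statement for a hypothetical in-game event (actor, verb, object, result) as it might be serialised before being sent to a Learning Record Store; the verb and activity identifiers are illustrative placeholders and are not necessarily those defined by the Serious Games xAPI Profile.

```python
# Sketch of one xAPI statement for a serious-game interaction. The verb and
# activity identifiers below are placeholders, not the Serious Games xAPI
# Profile vocabulary; a real tracker would POST this JSON to the LRS's
# /statements endpoint with an X-Experience-API-Version header.
import json
from datetime import datetime, timezone

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Player 42",
        "account": {"homePage": "https://example.org/game", "name": "player-42"},
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.org/game/levels/3",
        "definition": {"name": {"en-US": "Level 3"}},
    },
    "result": {
        "score": {"scaled": 0.8},
        "success": True,
        "duration": "PT4M30S",          # ISO 8601 duration
    },
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

print(json.dumps(statement, indent=2))
```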

Relevance: 30.00%

Abstract:

The complexity of modern geochemical data sets is increasing in several aspects (number of available samples, number of elements measured, number of matrices analysed, geological-environmental variability covered, etc.), hence it is becoming increasingly necessary to apply statistical methods to elucidate their structure. This paper presents an exploratory analysis of one such complex data set, the Tellus geochemical soil survey of Northern Ireland (NI). This exploratory analysis is based on one of the most fundamental exploratory tools, principal component analysis (PCA) and its graphical representation as a biplot, albeit in several variations: the set of elements included (only major oxides vs. all observed elements), the prior transformation applied to the data (none, a standardization or a log-ratio transformation) and the way the covariance matrix between components is estimated (classical estimation vs. robust estimation). Results show that a log-ratio PCA (robust or classical) of all available elements is the most powerful exploratory setting, providing the following insights: the first two processes controlling the whole geochemical variation in NI soils are peat coverage and a contrast between “mafic” and “felsic” background lithologies; peat-covered areas are detected as outliers by a robust analysis, and can then be filtered out if required for further modelling; and peat coverage intensity can be quantified with the %Br in the subcomposition (Br, Rb, Ni).
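The sketch below is a minimal, hedged illustration of the "log-ratio PCA" setting: a centred log-ratio (clr) transform applied to compositional data followed by an ordinary PCA. The data are synthetic and the covariance estimation is classical; a robust variant (as used to flag the peat-covered outliers) would substitute a robust covariance estimate such as MCD.

```python
# Sketch of a log-ratio PCA for compositional (geochemical) data: centred
# log-ratio transform followed by ordinary PCA. The data here are synthetic
# placeholders; a robust variant would e.g. build the PCA from a MinCovDet
# covariance estimate instead of the classical one.
import numpy as np
from sklearn.decomposition import PCA

def clr(composition):
    """Centred log-ratio: log of each part minus the row-wise log mean."""
    logx = np.log(composition)
    return logx - logx.mean(axis=1, keepdims=True)

rng = np.random.default_rng(1)
parts = rng.dirichlet(alpha=[4, 3, 2, 1, 1], size=200)   # 200 fake samples
scores = PCA(n_components=2).fit_transform(clr(parts))

print(scores[:3])   # first two principal components per sample (biplot axes)
```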

Relevance: 30.00%

Abstract:

PCR-based immunoglobulin (Ig)/T-cell receptor (TCR) clonality testing in suspected lymphoproliferations has largely been standardized and has consequently become technically feasible in a routine diagnostic setting. Standardization of the pre-analytical and post-analytical phases is now essential to prevent misinterpretation and incorrect conclusions derived from clonality data. As clonality testing is not a quantitative assay, but rather concerns recognition of molecular patterns, guidelines for reliable interpretation and reporting are mandatory. Here, the EuroClonality (BIOMED-2) consortium summarizes important pre- and post-analytical aspects of clonality testing, provides guidelines for interpretation of clonality testing results, and presents a uniform way to report the results of the Ig/TCR assays. Starting from an immunobiological concept, two levels for reporting Ig/TCR profiles are discerned: the technical description of individual (multiplex) PCR reactions and the overall molecular conclusion for B and T cells. Collectively, the EuroClonality (BIOMED-2) guidelines and consensus reporting system should help to improve the general performance level of clonality assessment and interpretation, which will directly impact routine clinical management (standardized best practice) in patients with suspected lymphoproliferations.
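Purely as an illustration of the two reporting levels distinguished above, the sketch below models them as simple data structures: one record per (multiplex) PCR reaction and one overall molecular conclusion per lineage. Field names and vocabularies are invented placeholders, not the EuroClonality (BIOMED-2) reporting template.

```python
# Illustrative data structures for the two reporting levels described above:
# per-PCR technical descriptions and an overall molecular conclusion per
# lineage. Field names and vocabularies are placeholders, not the official
# EuroClonality/BIOMED-2 reporting template.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PcrReaction:                      # level 1: one multiplex PCR tube
    target: str                         # e.g. "IGH FR1", "TRG"
    pattern: str                        # e.g. "clonal", "polyclonal"
    comment: str = ""

@dataclass
class MolecularConclusion:              # level 2: overall call per lineage
    lineage: str                        # "B cells" or "T cells"
    conclusion: str                     # e.g. "clonal", "polyclonal", "not evaluable"
    reactions: List[PcrReaction] = field(default_factory=list)

report = MolecularConclusion(
    lineage="B cells",
    conclusion="clonal",
    reactions=[PcrReaction("IGH FR1", "clonal"), PcrReaction("IGK", "polyclonal")],
)
print(report)
```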

Relevance: 30.00%

Abstract:

Provenance is a record that describes the people, institutions, entities, and activities involved in producing, influencing, or delivering a piece of data or a thing in the world. Some 10 years after beginning research on the topic of provenance, I co-chaired the provenance working group at the World Wide Web Consortium. The working group published the PROV standard for provenance in 2013. In this talk, I will present some use cases for provenance, the PROV standard and some flagship examples of adoption. I will then move on to our current research aimed at exploiting provenance, in the context of the Sociam, SmartSociety and ORCHID projects. In doing so, I will present techniques to deal with large-scale provenance, to build predictive models based on provenance, and to analyse provenance.
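As a small, hedged illustration of the PROV data model mentioned in the talk, the snippet below writes a minimal provenance record (one entity, the activity that generated it, and the agent it is attributed to) in the PROV-JSON serialisation; all names under the example prefix are invented.

```python
# Tiny provenance record in the PROV-JSON serialisation: one entity,
# the activity that generated it, and the agent it is attributed to.
# All names under the "ex" prefix are invented for illustration.
import json

prov_doc = {
    "prefix": {"ex": "http://example.org/"},
    "entity": {"ex:chart": {}},
    "activity": {"ex:plotting": {}},
    "agent": {"ex:alice": {}},
    "wasGeneratedBy": {
        "_:g1": {"prov:entity": "ex:chart", "prov:activity": "ex:plotting"}
    },
    "wasAssociatedWith": {
        "_:a1": {"prov:activity": "ex:plotting", "prov:agent": "ex:alice"}
    },
    "wasAttributedTo": {
        "_:at1": {"prov:entity": "ex:chart", "prov:agent": "ex:alice"}
    },
}

print(json.dumps(prov_doc, indent=2))
```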

Relevance: 30.00%

Abstract:

This paper outlines the development of a cross-correlation algorithm and a spiking neural network (SNN) for sound localisation based on real sound recorded in a noisy and dynamic environment by a mobile robot. The SNN architecture aims to simulate the sound localisation ability of the mammalian auditory pathways by exploiting the binaural cue of interaural time difference (ITD). The medial superior olive was the inspiration for the SNN architecture, which required the integration of an encoding layer that produced biologically realistic spike trains, a model of the bushy cells found in the cochlear nucleus, and a supervised learning algorithm. The experimental results demonstrate that biologically inspired sound localisation achieved using an SNN can compare favourably to the more classical technique of cross-correlation.
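For context on the classical baseline, the Python sketch below estimates the ITD of a two-microphone recording by cross-correlation and converts it to an azimuth under a far-field assumption; the sampling rate, microphone spacing and test signal are illustrative placeholders, not the robot's actual configuration.

```python
# Classical ITD estimation: cross-correlate the two microphone signals and
# convert the peak lag to an azimuth with a far-field approximation. The
# sampling rate, microphone spacing and test signal are placeholders.
import numpy as np

def estimate_itd(left, right, fs):
    """ITD in seconds, positive when the sound reaches the left mic first."""
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)
    return lag / fs

def itd_to_azimuth_deg(itd_s, mic_distance_m=0.2, sound_speed_m_s=343.0):
    """Far-field approximation: sin(azimuth) = ITD * c / d."""
    s = np.clip(itd_s * sound_speed_m_s / mic_distance_m, -1.0, 1.0)
    return float(np.degrees(np.arcsin(s)))

fs = 44_100
sig = np.random.default_rng(2).normal(size=int(0.05 * fs))   # 50 ms of noise
true_delay = 20                                              # samples
left, right = sig, np.roll(sig, true_delay)                  # right ear lags
itd = estimate_itd(left, right, fs)
print(f"ITD {itd * 1e6:.0f} us -> azimuth {itd_to_azimuth_deg(itd):.1f} deg")
```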