921 results for Means-end approach


Relevance:

30.00%

Publisher:

Abstract:

For at least two millennia and probably much longer, the traditional vehicle for communicating geographical information to end-users has been the map. With the advent of computers, the means of both producing and consuming maps have been radically transformed, while the inherent nature of the information product has also expanded and diversified rapidly. This has given rise in recent years to the new concept of geovisualisation (GVIS), which draws on the skills of the traditional cartographer, but extends them into three spatial dimensions and may also add temporality, photorealistic representations and/or interactivity. Demand for GVIS technologies and their applications has increased significantly in recent years, driven by the need to study complex geographical events, and in particular their associated consequences, and to communicate the results of these studies to a diversity of audiences and stakeholder groups. GVIS brings together data integration, multi-dimensional spatial display, advanced modelling techniques, dynamic design and development environments and field-specific application needs. To meet these needs, GVIS tools should be both powerful and inherently usable, in order to facilitate their role in helping interpret and communicate geographic problems. However, no framework currently exists for ensuring this usability. The research presented here seeks to fill this gap by addressing the challenges of incorporating user requirements in GVIS tool design. It starts from the premise that usability in GVIS should be incorporated and implemented throughout the whole design and development process. To facilitate this, Subject Technology Matching (STM) is proposed as a new approach to assessing and interpreting user requirements. Based on STM, a new design framework called Usability Enhanced Coordination Design (UECD) is then presented, with the purpose of improving the overall usability of the design outputs. UECD places GVIS experts in a new key role in the design process, to form a more coordinated and integrated workflow and more focused, interactive usability testing. To prove the concept, these theoretical elements of the framework have been implemented in two test projects: one is the creation of a coastal inundation simulation for Whitegate, Cork, Ireland; the other is a flood mapping tool for Zhushan Town, Jiangsu, China. The two case studies successfully demonstrated the potential merits of the UECD approach when GVIS techniques are applied to geographic problem solving and decision making. The thesis delivers a comprehensive understanding of the development and challenges of GVIS technology, its usability concerns and the associated user-centred design (UCD) approach; it explores the possibility of embedding a UCD framework in GVIS design; it constructs a new theoretical design framework, UECD, which aims to make the whole design process usability driven; and it develops the key concept of STM into a template set to improve the performance of a GVIS design. These key conceptual and procedural foundations can be built on by future research aimed at further refining and developing UECD as a useful design methodology for GVIS scholars and practitioners.

Relevance:

30.00%

Publisher:

Abstract:

This thesis argues that, through the prism of America’s Cold War, scientism has emerged as the metanarrative of the postnuclear age. The advent of the bomb brought about a new primacy for mechanical and hyperrational thinking in the corridors of power, not just in terms of managing the bomb itself but in diffusing this ideology throughout the culture, in the social sciences, economics and other such institutional systems. The human need to mitigate, or guard against, the chaos of the universe lies at the heart not just of religious faith but of the desire for perfect control. Thus there has been a transference of power from religious faith to the apparent material power of science and technology and the terra firma these supposedly objective means supply. The Cold War, however, was a highly ideologically charged opposition between the two superpowers, and the scientific methodology that sprang forth in the United States to manage the Cold War and the bomb was not an objective scientific system divorced from that paranoia and dogma but a system that assumed a radically fundamentalist idea of capitalism. This is apparent in the widespread diffusion of game theory throughout Western postindustrial institutions. The inquiry of the thesis thus examines texts that engage with and criticise American Cold War methodology, beginning with the nuclear moment, so to speak, and Dr Strangelove’s incisive satire of moral abdication to machine processes. Moving on chronologically, the thesis examines the diffusion of particular kinds of masculinity and sexuality in postnuclear culture in Crash and End Zone, and finishes its analysis with the ethnographic portrayal of a modern American city in The Wire. More than anything else, the thesis wishes to reveal to what extent this technocratic consciousness puts pressure on language and on binding narratives.

Relevance:

30.00%

Publisher:

Abstract:

The core of this thesis is the study of NATO’s Comprehensive Approach strategy to state building in Afghanistan between 2006 and 2011. It argues that this strategy sustained operational and tactical practices which were ineffective in responding to the evolved nature of the security problem. The thesis interrogates the Comprehensive Approach along ontological, empirical and epistemological lines and concludes that the failure of the Comprehensive Approach in the specific Afghan case is, in fact, indicative of underlying theoretical and pragmatic flaws which, therefore, generalize the dilemma. The research is pragmatic in nature, employing mixed methods (quantitative and qualitative) concurrently. Qualitative methods include research into primary and secondary literature sources, supplemented by the author’s personal experiences in Afghanistan in 2008 and in various NATO HQ and Canadian settings. Quantitative research includes an empirical case study focussing on NATO’s Afghan experience and its attempt at state building between 2006 and 2011. This study includes a historical review of NATO’s evolving involvement in Afghanistan over the subject timeframe; offers an analysis of human development and governance-related data mapped to the expected outcomes of the Afghan National Development Strategy and NATO’s comprehensive campaign design; and interrogates the Comprehensive Approach strategy by means of an analysis of conceptual, institutional and capability gaps in the context of an integrated investigational framework. The results of the case study lead to an investigation of a series of research questions related to the potential impact of the failure of the Comprehensive Approach for NATO in Afghanistan and the limits of state building as a means of attaining security for the Alliance.

Relevance:

30.00%

Publisher:

Abstract:

The mobile cloud computing paradigm can offer relevant and useful services to the users of smart mobile devices. Such public services already exist on the web and in cloud deployments, by implementing common web service standards. However, these services are described by mark-up languages, such as XML, that cannot be comprehended by non-specialists. Furthermore, the lack of common interfaces for related services makes discovery and consumption difficult for both users and software. The problem of service description, discovery, and consumption for the mobile cloud must be addressed to allow users to benefit from these services on mobile devices. This paper introduces our work on a mobile cloud service discovery solution, which is utilised by our mobile cloud middleware, Context Aware Mobile Cloud Services (CAMCS). The aim of our approach is to remove complex mark-up languages from the description and discovery process. By means of the Cloud Personal Assistant (CPA) assigned to each user of CAMCS, relevant mobile cloud services can be discovered and consumed easily by the end user from the mobile device. We present the discovery process, the architecture of our own service registry, and the service description structure. CAMCS allows services to be used from the mobile device through a user's CPA, by means of user-defined tasks. We present the task model of the CPA enabled by our solution, including automatic tasks, which can perform work for the user without an explicit request.
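
The general idea of replacing mark-up-based service descriptions with something a non-specialist (or an assistant acting for one) can query is illustrated below. This is a hypothetical sketch only: the registry entries, field names and keyword-matching rule are our own stand-ins, not the published CAMCS/CPA interfaces.

```python
# Hypothetical illustration only: CAMCS/CPA internals are not reproduced here, so the
# registry entries, field names and keyword-matching rule below are our own stand-ins
# for the general idea of describing services without XML-style mark-up and letting a
# personal assistant discover them from a plain-language request.
SERVICE_REGISTRY = [
    {"name": "weather-forecast", "keywords": {"weather", "forecast", "rain"},
     "endpoint": "https://example.org/weather"},
    {"name": "local-transport", "keywords": {"bus", "train", "timetable"},
     "endpoint": "https://example.org/transport"},
]

def discover(user_request: str):
    """Return registry entries whose keywords overlap the words in the request."""
    words = set(user_request.lower().split())
    return [entry for entry in SERVICE_REGISTRY if entry["keywords"] & words]

# A CPA-style task: the user states a goal once; matching services are discovered and
# could then be invoked on the user's behalf (invocation is omitted in this sketch).
print(discover("will it rain in cork tomorrow"))
```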

Relevance:

30.00%

Publisher:

Abstract:

Much of science progresses within the tight boundaries of what is often seen as a "black box". Though familiar to funding agencies, researchers and the academic journals they publish in, it is an entity that outsiders rarely get to peek into. Crowdfunding is a novel means of allowing the public to participate in, as well as to support and witness, advances in science. Here we describe our recent crowdfunding efforts to sequence the Azolla genome, a little fern with massive green potential. Crowdfunding is a worthy platform not only for obtaining seed money for exploratory research, but also for engaging directly with the general public as a rewarding form of outreach.

Relevance:

30.00%

Publisher:

Abstract:

A defect equation for the coupling of nonlinear subproblems, defined on non-overlapping subdomains arising in domain decomposition methods, is presented. Numerical solutions of defect equations by means of quasi-Newton methods are considered.
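
The structure of such a computation can be sketched as follows. The toy subproblems, names and tolerances below are our own choices under stated assumptions, not the paper's formulation: two nonlinear subproblems are coupled through a single interface value s, the defect d(s) measures the mismatch of the two subdomain solutions at the interface, and d(s) = 0 is solved with a quasi-Newton (Broyden) method so that no Jacobian of the composite map is needed.

```python
# Minimal sketch, assuming toy scalar subproblems of our own choosing.
import numpy as np
from scipy.optimize import broyden1

def solve_subdomain_1(s):
    # Toy nonlinear subproblem 1: find u with u + u**3 = 1 + s (local Newton iteration).
    u = 0.0
    for _ in range(50):
        u = u - (u + u**3 - 1.0 - s) / (1.0 + 3.0 * u**2)
    return u

def solve_subdomain_2(s):
    # Toy nonlinear subproblem 2: find u with exp(u) + u = 3 - s (local Newton iteration).
    u = 0.0
    for _ in range(50):
        u = u - (np.exp(u) + u - 3.0 + s) / (np.exp(u) + 1.0)
    return u

def defect(s):
    # Defect equation: mismatch of the two subdomain solutions at the interface.
    return solve_subdomain_1(s) - solve_subdomain_2(s)

s_star = broyden1(defect, 0.5, f_tol=1e-10)   # quasi-Newton solve of d(s) = 0
print(float(s_star), float(defect(s_star)))
```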

Relevance:

30.00%

Publisher:

Abstract:

In terms of a general time theory which addresses time-elements as typed point-based intervals, a formal characterization of time-series and state-sequences is introduced. Based on this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
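
To make the reduction concrete, the sketch below scores a query against each candidate segment by minimum-cost bipartite assignment between their elements, with a cost that mixes a value (non-temporal) term and a temporal-displacement term. The cost function and weights are our own illustrative choices, not the paper's hybrid similarity model.

```python
# Minimal sketch, assuming an illustrative hybrid cost (value mismatch + temporal shift).
import numpy as np
from scipy.optimize import linear_sum_assignment

def bipartite_similarity(query, segment, alpha=0.5):
    """Similarity in [0, 1] between two equal-length sequences."""
    q = np.asarray(query, dtype=float)
    s = np.asarray(segment, dtype=float)
    n = len(q)
    idx_q = np.arange(n)[:, None]
    idx_s = np.arange(n)[None, :]
    value_cost = np.abs(q[:, None] - s[None, :])           # non-temporal measurement
    time_cost = np.abs(idx_q - idx_s) / max(n - 1, 1)      # temporal displacement
    cost = alpha * value_cost + (1.0 - alpha) * time_cost  # hybrid cost matrix
    rows, cols = linear_sum_assignment(cost)               # minimum-cost bipartite matching
    return 1.0 / (1.0 + cost[rows, cols].sum() / n)

query = [0.1, 0.9, 0.4, 0.4, 0.8]
series = [0.0, 0.2, 0.1, 0.8, 0.5, 0.4, 0.9, 0.3]
# Slide the query over the series and keep the best-matching subsequence.
scores = [bipartite_similarity(query, series[i:i + len(query)])
          for i in range(len(series) - len(query) + 1)]
print("best match starts at index", int(np.argmax(scores)))
```

Because the assignment is free to cross, locally swapped or shifted elements incur only a small temporal penalty rather than breaking the match outright, which is the tolerance to inversion and crossover the abstract refers to.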

Relevance:

30.00%

Publisher:

Abstract:

This presentation reports on the formal evaluation, through questionnaires, of a new Level 1 undergraduate course for 130 student teachers that uses blended learning. The course design seeks to radicalise the department’s approach to teaching, learning and assessment and to use students as change agents. Its structure and content model social constructivist approaches to learning. Building on students’ experiences of, and reflections on, previous learning promotes further learning through the support of “able others” (Vygotsky 1978), facilitating and nurturing a secure community of practice for students new to higher education. The course’s design incorporates individual, paired, small and large group activities and exploits online video, audio and text materials. Course units begin and end with face-to-face tutor-led activities. Online elements, including discussions and formative submissions, are tutor-mediated. Students work together face-to-face and online to read articles, write reflections, develop presentations, conduct research and share experiences and resources. Summative joint assignments and peer assessments emphasise the value of collaboration and teamwork for academic, personal and professional development. Initial informal findings are positive, indicating that students have engaged readily with the course content and structure, with few reporting difficulties accessing or using technology. Students have welcomed the opportunity to work together to tackle readings in a new genre, pilot presentation skills and receive and give constructive feedback to peers. Course tutors have indicated that depth and quality of study are evident, with regular online formative submissions enabling tutors to identify and engage directly with students’ needs, provide feedback and develop appropriately designed distance and face-to-face teaching materials. Pastoral tutors have indicated that students have reported non-engagement of peers, leading to the rapid application of academic or personal support. Outcomes of the formal evaluation will inform the development of Level 2 and 3 courses and influence the department’s use of blended learning.

Relevance:

30.00%

Publisher:

Abstract:

RATIONALE & OBJECTIVES: The food multimix (FMM) concept states that limited food resources can be combined, using scientific knowledge, to meet the nutrient needs of vulnerable groups at low cost, utilizing the ‘nutrient strengths’ of individual candidate foods in composite recipes within a cultural context. METHODS: The method employed the food-to-food approach for recipe development using traditional food ingredients. Recipes were subjected to proximate and micronutrient analysis and optimized to meet at least 40% of recommended daily intakes. End products including breads, porridge and soup were developed. RESULTS: FMM products were employed in a feeding trial among 120 healthy pregnant women in Gauteng, South Africa, resulting in improvements in serum iron levels from baseline values of 14.59 (±7.67) µmol/L and 14.02 (±8.13) µmol/L for the control and intervention groups respectively (p=0.71), to 16.03 (±5.67) µmol/L and 18.66 (±9.41) µmol/L (p=0.19). The increases from baseline to post-intervention were, however, statistically significant within groups. Similarly, Mean Cell Volume values improved from baseline, as did serum ferritin and transferrin levels. CONCLUSION: The FMM concept has potential value in feeding programs for vulnerable groups including pregnant and lactating mothers.

Relevance:

30.00%

Publisher:

Abstract:

This review examines interregional linkages and gives an overview perspective on marine ecosystem functioning in the north-eastern Atlantic. It is based on three of the 'systems' considered by the European Network of Excellence for Ocean Ecosystems Analysis (EUR-OCEANS, established in 2004 under the European Framework VI funding programme to promote integration of marine ecological research within Europe): the Arctic and Nordic Seas, the North Atlantic shelf seas and the North Atlantic. The three systems share common open boundaries, and the transport of water, heat, nutrients and particulates across these boundaries modifies local processes. Consistent with the EUR-OCEANS concept of 'end-to-end' analyses of marine food webs, the review takes an integrated approach, linking ocean physics to the lower trophic levels and working up the food web to top predators such as marine mammals. We begin with an overview of the regions, focusing on the major physical patterns and their implications for the microbial community, phytoplankton, zooplankton, fish and top predators. Human-induced links between the regional systems are then considered and finally possible changes in the regional linkages over the next century are discussed. Because of the scale of potential impacts of climate change, this issue is considered in a separate section. The review demonstrates that the functioning of the ecosystems in each of the regions cannot be considered in isolation and that the role of the atmosphere and ocean currents in linking the North Atlantic Ocean, the North Atlantic shelf seas and the Arctic and Nordic Seas must be taken into account. Studying the North Atlantic and associated shelf seas as an integrated 'basin-scale' system will be a key challenge for the early twenty-first century. This requires a multinational approach that should lead to improved ecosystem-based approaches to conservation of natural resources, the maintenance of biodiversity, and a better understanding of the key role of the north-eastern Atlantic in the global carbon cycle.

Relevance:

30.00%

Publisher:

Abstract:

The presence of a quasi-stationary anticyclonic eddy within the southeastern Bay of Biscay (centred around 44°30′N-4°W) has been reported on various occasions in the literature. The analysis made in this study for the period 2003–2010, using in situ and remote sensing measurements and model results, shows that this mesoscale coherent structure is present almost every year from the end of winter/beginning of spring to the beginning of fall. During this period it remains in an area limited to the east by the Landes Plateau, to the west by Le Danois Bank and the Torrelavega canyon and to the northwest by the Jovellanos seamount. All the observations and analyses made in this contribution suggest that this structure is generated between the Capbreton and Torrelavega canyons. Detailed monitoring from in situ and remote sensing data of an anticyclonic quasi-stationary eddy in 2008 shows the origin of this structure in a warm water current located around 43°42′N-3°30′W in mid-January. This coherent structure is monitored until August around the same area, where it has a marked influence on the Sea Level Anomaly, Sea Surface Temperature and surface Chlorophyll-a concentration. An eddy tracking method, applied to the outputs of a numerical model, shows that the model is able to reproduce this type of eddy, with similar 2D characteristics and lifetimes to those suggested by the observations and previous works. This is the case, for instance, of the simulated MAY04 eddy, which was generated in May 2004 around the Torrelavega canyon and remained quasi-stationary in the area for 4 months. The diameter of this eddy ranged from 40 to 60 km, its azimuthal velocity was less than 20 cm s⁻¹, its vertical extension reached 3000–3500 m depth during April and May and it was observed to interact with other coherent structures.

Relevance:

30.00%

Publisher:

Abstract:

Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring and atmospheric studies. Single flight lines of airborne hyperspectral data are often in the region of tens of gigabytes in size. This means that a single aircraft can collect terabytes of remotely sensed hyperspectral data during a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude and then geocorrected. To enable efficient processing of these large datasets, the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common Band Interleaved by Line (BIL) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the facility's automated processing chain; to this end, APL can be used under both Windows and Linux environments on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008, together with in situ surveyed ground control points.
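
The BIL layout mentioned above stores, for each image line, one row of samples per band, so a whole flight-line cube can be recovered with a single reshape. The sketch below is a generic NumPy reading pattern, not part of APL itself; the dimensions, data type and file name are illustrative assumptions.

```python
# Minimal sketch, assuming illustrative dimensions/dtype; not an APL interface.
import numpy as np

lines, bands, samples = 50, 244, 100
dtype = np.uint16

# Write a small synthetic BIL file so the example is self-contained.
rng = np.random.default_rng(0)
rng.integers(0, 4096, size=lines * bands * samples, dtype=dtype).tofile("example.bil")

raw = np.fromfile("example.bil", dtype=dtype)
cube = raw.reshape(lines, bands, samples)   # BIL ordering: line -> band -> sample
band_50 = cube[:, 50, :]                    # one spectral band as a 2-D image
spectrum = cube[10, :, 60]                  # full spectrum at one pixel
print(band_50.shape, spectrum.shape)
```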

Relevance:

30.00%

Publisher:

Abstract:

This paper provides a summary of our studies on robust speech recognition based on a new statistical approach – the probabilistic union model. We consider speech recognition given that part of the acoustic features may be corrupted by noise. The union model is a method for basing the recognition on the clean part of the features, thereby reducing the effect of the noise on recognition. In this respect, the union model is similar to the missing feature method. However, the two methods achieve this end through different routes. The missing feature method usually requires the identity of the noisy data for noise removal, while the union model combines the local features based on the union of random events, to reduce the dependence of the model on information about the noise. We previously investigated applications of the union model to speech recognition involving unknown partial corruption in frequency bands, in time duration, and in feature streams. Additionally, a combination of the union model with conventional noise-reduction techniques was studied, as a means of dealing with a mixture of known or trainable noise and unknown, unexpected noise. In this paper, a unified review of each of these applications is provided, in the context of dealing with unknown partial feature corruption, giving the appropriate theory and implementation algorithms along with an experimental evaluation.
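
The combination rule at the heart of this approach can be illustrated with the generic probability of a union of feature events; the identity below (with λ denoting the acoustic model) is a sketch of that underlying idea rather than the paper's exact order-of-union formulation.

```latex
% Illustrative identity only: the probability of the union of N feature events
% x_1, ..., x_N given a model \lambda, written by inclusion-exclusion.
\[
P\!\left(\bigcup_{i=1}^{N} x_i \,\middle|\, \lambda\right)
  = \sum_{i} P(x_i \mid \lambda)
  \;-\; \sum_{i<j} P(x_i, x_j \mid \lambda)
  \;+\; \cdots
  \;+\; (-1)^{N-1}\, P(x_1, \ldots, x_N \mid \lambda).
\]
% Unlike a full product of per-stream likelihoods, a score built on the union is
% dominated by the feature events that match the model well, so a single badly
% corrupted stream cannot drive the overall score towards zero.
```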