790 results for Digital mapping -- Case studies -- Congresses
Abstract:
These National Guidelines and Case Studies for Digital Modelling are the outcomes from one of a number of Building Information Modelling (BIM)-related projects undertaken by the CRC for Construction Innovation. Since the CRC opened its doors in 2001, the industry has seen a rapid increase in interest in BIM and a widening adoption. These guidelines and case studies are thus very timely, as the industry moves to model-based working and starts to share models in a new context called integrated practice. Federal and state governments in Australia, as well as the government of New Zealand, are starting to outline the role they might take so that, in contrast to the adoption of 2D CAD in the early 1990s, a national, industry-wide benefit results from this new paradigm of working. Section 1 of the guidelines gives an overview of BIM: how it affects our current mode of working and what we need to do to move to fully collaborative model-based facility development. The role of open standards such as IFC is described as a mechanism to support new processes and to make the extensive design and construction information available to asset operators and managers. Digital collaboration modes, types of models, levels of detail, object properties and model management complete this section. It will be relevant for owners, managers and project leaders as well as direct users of BIM. Section 2 provides recommendations and guides for key areas of model creation and development, and for the move to simulation and performance measurement. These are the more practical parts of the guidelines, developed for design professionals, BIM managers, technical staff and ‘in the field’ workers. The guidelines are supported by six case studies, including a summary of lessons learnt about implementing BIM in Australian building projects.
A key aspect of these publications is the identification of a number of important industry actions: the need for BIM-compatible product information and a national context for classifying product data; the need for an industry agreement and setting process for process definition; and finally, the need to ensure a national standard for sharing data between all of the participants in the facility-development process.
Abstract:
OpenStreetMap began in 2004 and has grown in parallel with free software projects to become the oldest and largest example of what is known as volunteered geographic information (VGI). The growing use of this kind of data, however, leaves certain questions open, such as: How reliable are data obtained in this way? What is their quality? This study compares the geographic data produced by volunteers within the OpenStreetMap collaborative project with the data produced by institutions and harmonised within the Cartociudad project. The aim of the comparison is to evaluate the quality of the former against the latter. To this end, the term cartographic quality is defined and the different elements of OpenStreetMap's cartographic quality are evaluated: spatial and attribute accuracy, completeness, temporal quality and logical consistency. The analysis uses data at two levels: the municipality and/or province of Valencia. The results show that OpenStreetMap has positional and temporal accuracy that is more than adequate for geocoding and route-calculation uses. However, the heterogeneity of data coverage and certain internal inconsistencies may compromise its use. Even so, the study highlights the potential of the project and of an optimal route-calculation solution (OpenRouteService) that successfully uses OpenStreetMap data.
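The positional-accuracy element of such a comparison reduces to measuring the offset between matched features in the volunteer and reference datasets. As a minimal sketch (the coordinate pairs below are invented for illustration, not taken from the study), the mean offset between matched OSM and reference points can be computed with a haversine distance:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    R = 6371000.0  # mean Earth radius in metres
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# Hypothetical matched pairs: (OSM lat, OSM lon, reference lat, reference lon)
pairs = [
    (39.4699, -0.3763, 39.4700, -0.3762),
    (39.4623, -0.3545, 39.4621, -0.3547),
]
offsets = [haversine_m(a, b, c, d) for a, b, c, d in pairs]
print(f"mean positional offset: {sum(offsets) / len(offsets):.1f} m")
```

Summary statistics over such offsets (mean, RMSE, percentiles) are what positional-accuracy assessments typically report.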
Abstract:
Quantitative databases are limited to information identified as important by their creators, while databases containing natural language are limited by our ability to analyze large unstructured bodies of text. Leximancer is a tool that uses semantic mapping to develop concept maps from natural language. We have applied Leximancer to education-based pathology case notes to demonstrate how real patient records or databases of case studies could be analyzed to identify unique relationships. We then discuss how such analysis could be used to conduct quantitative analysis from databases such as the Coronary Heart Disease Database.
Abstract:
A collection of 60 case studies of the use of Creative Commons licensing in different sectors, including: music, social activism, film, visual arts, collecting, government, publishing and education.
Abstract:
The shift from 20th century mass communications media towards convergent media and Web 2.0 has raised the possibility of a renaissance of the public sphere, based around citizen journalism and participatory media culture. This paper will evaluate such claims both conceptually and empirically. At a conceptual level, it is noted that the question of whether media democratization is occurring depends in part upon how democracy is understood, with some critical differences in understandings of democracy, the public sphere and media citizenship. The empirical work in this paper draws upon various case studies of new developments in Australian media, including online-only newspapers, developments in public service media, and the rise of commercially based online alternative media. It is argued that participatory media culture is being expanded if understood in terms of media pluralism, but that implications for the public sphere depend in part upon how media democratization is defined.
Abstract:
There is a growing awareness worldwide of the significance of social media to communication in times of both natural and human-created disasters and crises. While the media have long been used as a means of broadcasting messages to communities in times of crisis – bushfires, floods, earthquakes etc. – the significance of social media in enabling many-to-many communication through ubiquitous networked computing and mobile media devices is becoming increasingly important in the fields of disaster and emergency management. This paper undertakes an analysis of the uses made of social media during two recent natural disasters: the January 2011 floods in Brisbane and South-East Queensland in Australia, and the February 2011 earthquake in Christchurch, New Zealand. It is part of a wider project being undertaken by a research team based at the Queensland University of Technology in Brisbane, Australia, that is working with the Queensland Department of Community Safety (DCS) and the EIDOS Institute, and funded by the Australian Research Council (ARC) through its Linkages program. The project combines large-scale, quantitative social media tracking and analysis techniques with qualitative cultural analysis of communication efforts by citizens and officials, to enable both emergency management authorities and news media organisations to develop, implement, and evaluate new social media strategies for emergency communication.
Abstract:
In Finland, one of the most important current issues in environmental management is the quality of surface waters. The increasing social importance of lakes and water systems has generated wide-ranging interest in lake restoration and management, concerning especially lakes suffering from eutrophication, but also from other environmental impacts. Most of the factors deteriorating the water quality in Finnish lakes are connected to human activities. Especially since the 1940s, intensified farming practices and the conduction of sewage waters from scattered settlements, cottages and industry have affected the lakes, which have simultaneously developed into recreational areas for a growing number of people. Therefore, this study focused on small lakes that are human-impacted, located close to settlement areas and of significant value to the local population. The aim of this thesis was to obtain information from lake sediment records for ongoing lake restoration activities and to prove that a well-planned, properly focused lake sediment study is an essential part of the work related to the evaluation, target consideration and restoration of Finnish lakes. Altogether 11 lakes were studied. The study of Lake Kaljasjärvi was related to the gradual eutrophication of the lake. In lakes Ormajärvi, Suolijärvi, Lehee, Pyhäjärvi and Iso-Roine the main focus was on sediment mapping, as well as on the long-term changes in sedimentation, which were compared to Lake Pääjärvi. In Lake Hormajärvi, the role of different kinds of sedimentation environments in the eutrophication development of the lake's two basins was compared. Lake Orijärvi has not been eutrophied, but ore exploitation and the related acid mine drainage from the catchment area have influenced the lake drastically, and the changes caused by the metal load were investigated.
The twin lakes Etujärvi and Takajärvi are slightly eutrophied, but also suffer problems associated with the erosion of the substantial peat accumulations covering the fringe areas of the lakes. These peat accumulations are related to Holocene water level changes, which were investigated. The methods used were chosen case-specifically for each lake. In general, acoustic soundings of the lakes, detailed description of the nature of the sediment and determinations of the physical properties of the sediment, such as water content, loss on ignition and magnetic susceptibility, were used, as was grain size analysis. A wide set of chemical analyses was also used. Diatom and chrysophycean cyst analyses were applied, and the diatom-inferred total phosphorus content was reconstructed. The results of these studies prove that the ideal lake sediment study, as part of a lake management project, should be two-phased. In the first phase, thorough mapping of sedimentation patterns should be carried out by soundings and adequate corings. The actual sampling, based on the preliminary results, must include at least one long core from the main sedimentation basin for determining the natural background state of the lake. The recent, artificially impacted development of the lake can then be determined by short-core and surface sediment studies. The sampling must again be focused on the basis of the sediment mapping, and it should represent all the different sedimentation environments and bottom-dynamic zones, considering the inlets and outlets, as well as the effects of possible point loaders of the lake. In practice, the budget of lake management projects is usually limited and only the most essential work and analyses can be carried out. The set of chemical and biological analyses and dating methods must therefore be thoroughly considered and adapted to the specific management problem.
The results also show that information obtained from a properly performed sediment study enhances the planning of the restoration, makes it possible to define the target of the remediation activities and improves the cost-efficiency of the project.
Abstract:
When performing in opera, a singer portrays a character. The libretto is used as the principal resource for this research, although the music can also reveal insights into the composer’s ideas regarding characterization. This performance dissertation examines how musical devices such as genre, texture, meter, melody, instrumentation and form can be used to inform choices of characterization. Three roles from diverse operas were examined and performed. The first role, Estelle Oglethorpe in Later the Same Evening (2007) by John Musto (b. 1954), was performed on November 15, 16, 17 and 18, 2007. The second role, Dorabella in Così fan tutte (1789) by Wolfgang Amadeus Mozart (1756-1791), was performed on April 20, 25 and 27, 2008. The third role, Olga in Eugene Onegin (1878) by Pyotr Ilyich Tchaikovsky (1840-1893), was performed on April 19, 2009. All operas were presented by the University of Maryland Opera Studio at the Ina and Jack Kay Theater in the Clarice Smith Performing Arts Center, University of Maryland, College Park. DVD recordings of all performances can be found in the University of Maryland library system.
Abstract:
Online information seeking has become normative practice among both academics and the general population. This study appraised the performance of eight databases in retrieving research pertaining to the influence of social networking sites on the mental health of young people. A total of 43 empirical studies on young people’s use of social networking sites and the mental health implications were retrieved. Scopus and SSCI had the highest sensitivity, with PsycINFO having the highest precision. Effective searching requires large generic databases, supplemented by subject-specific catalogues. The methodology developed here may provide inexperienced searchers, such as undergraduate students, with a framework for defining a realistic scale of searching to undertake for a particular literature review or similar project.
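The sensitivity and precision figures in the abstract above follow the standard retrieval definitions: sensitivity (recall) is the share of all eligible studies a database finds, and precision is the share of a database's hits that are eligible. As a minimal illustration (the retrieval counts below are invented, not the study's data):

```python
# Sensitivity (recall) and precision of one database's search results,
# using hypothetical sets of study identifiers.
relevant = set(range(1, 44))                      # 43 eligible studies in total
retrieved = {1, 2, 3, 5, 8, 13, 21, 34, 50, 51}   # hits from one database

true_positives = len(relevant & retrieved)
sensitivity = true_positives / len(relevant)   # eligible studies found
precision = true_positives / len(retrieved)    # hits that were eligible

print(f"sensitivity = {sensitivity:.2f}, precision = {precision:.2f}")
# → sensitivity = 0.19, precision = 0.80
```

Comparing these two numbers across databases is what motivates the abstract's recommendation to pair large generic databases (high sensitivity) with subject-specific catalogues (high precision).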
Structuring and moodleing a course: case studies at the Polytechnic of Porto - School of Engineering
Abstract:
This work presents a comparative study covering four different courses lectured at the Polytechnic of Porto - School of Engineering, with respect to the usage of a particular Learning Management System, i.e. Moodle, and its impact on students' results. Even though positive correlation factors exist, e.g. between the number of Moodle accesses and the final exam grade obtained by each student, the explanation behind them may not be straightforward. Mapping this particular factor to course numbers reveals that the quality of the resources might be preponderant, and not only their quantity. This paper also addresses teachers who used this platform as a complement to their courses (b-learning) and identifies some particular issues they should be aware of in order to strengthen students' engagement and learning.
Abstract:
This paper presents a study documenting the general trends in programming techniques, aided behavioral thresholds, speech perception abilities, and overall behavior when converting children to the processing strategy called HiResolution (HiRes), used with the Advanced Bionics Clarion II Cochlear Implant System.
Abstract:
Education, especially higher education, is considered vital for maintaining national and individual competitiveness in the global knowledge economy. Following the introduction of its “Free Education Policy” as early as 1947, Sri Lanka is now the best performer in basic education in the South Asian region, with a remarkable record in terms of high literacy rates and the achievement of universal primary education. However, access to tertiary education is a bottleneck, due to an acute shortage of university places. In an attempt to address this problem, the government of Sri Lanka has invested heavily in information and communications technologies (ICTs) for distance education. Although this has resulted in some improvement, the authors of this article identify several barriers which are still impeding successful participation for the majority of Sri Lankans wanting to study at tertiary level. These impediments include the lack of infrastructure/resources, low English language proficiency, weak digital literacy, poor quality of materials and insufficient provision of student support. In the hope that future implementations of ICT-enabled education programmes can avoid repeating the mistakes identified by their research in this Sri Lankan case, the authors conclude their paper with a list of suggested policy options.