939 results for Qualitative case study
Abstract:
Mapping the shear wave velocity profile is an important part of seismic hazard and microzonation studies. The shear wave velocity of soil in the city of Bangalore was mapped using the Multichannel Analysis of Surface Waves (MASW) technique. An empirical relationship was found between the corrected Standard Penetration Test (SPT) N value ((N1)60cs) and the measured shear wave velocity (Vs). The survey points were selected so that the results represent the entire Bangalore region, covering an area of 220 km². Fifty-eight 1-D and 20 2-D MASW surveys were performed and their velocity profiles determined. The average shear wave velocity of Bangalore soils was evaluated for depths of 5 m, 10 m, 15 m, 20 m, 25 m and 30 m. Sub-soil classification for the evaluation of seismic local site effects was based on the average shear wave velocity over 30 m depth (Vs30), using the National Earthquake Hazards Reduction Program (NEHRP) and International Building Code (IBC) classifications. Mapping clearly indicates that the soil depths obtained from MASW closely match the soil layers identified in SPT boreholes. Estimating local site effects for an earthquake requires knowledge of the dynamic properties of soil, which are usually expressed in terms of shear wave velocity. Hence, to make use of the abundant SPT data available from many geotechnical projects in Bangalore, an attempt was made to develop a relationship between Vs (m/s) and (N1)60cs. The shear wave velocity measured at 38 locations close to SPT boreholes was used to generate the correlation between corrected N values and shear wave velocity. A power-fit model was developed with a coefficient of determination (R²) of 0.84. This relationship between shear wave velocity and corrected SPT N values agrees well with the Japan Road Association equations.
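A power-fit correlation of the kind described above can be sketched as follows. The abstract does not report the fitted constants, so the (N, Vs) pairs below are synthetic illustrative values, not the Bangalore dataset; fitting in log-log space turns the power law Vs = a·N^b into a straight line.

```python
import numpy as np

# Synthetic illustrative (N, Vs) pairs -- NOT the Bangalore data.
# N is the corrected SPT blow count (N1)60cs; Vs is in m/s.
n_values = np.array([5, 10, 15, 20, 30, 40, 50], dtype=float)
vs_values = np.array([150, 200, 235, 265, 310, 345, 375], dtype=float)

# Fit log(Vs) = log(a) + b * log(N) by linear least squares.
b, log_a = np.polyfit(np.log(n_values), np.log(vs_values), 1)
a = np.exp(log_a)

def predict_vs(n):
    """Predicted shear wave velocity (m/s) for a corrected SPT value n."""
    return a * n**b

# Coefficient of determination (R^2) of the fit in linear space.
residuals = vs_values - predict_vs(n_values)
r_squared = 1.0 - (residuals**2).sum() / ((vs_values - vs_values.mean())**2).sum()
```

With real borehole data the same two-line fit yields the site-specific coefficients; the R² reported in the study (0.84) would be computed exactly as above.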
Abstract:
The current approach for protecting receiving waters from urban stormwater pollution is the adoption of structural measures commonly referred to as Water Sensitive Urban Design (WSUD). The treatment efficiency of WSUD measures depends closely on the design of the specific treatment units. As stormwater quality is influenced by rainfall characteristics, the selection of appropriate rainfall events for treatment design is essential to ensure the effectiveness of WSUD systems. Based on extensive field investigations in four urban residential catchments on the Gold Coast, Australia, and on computer modelling, this paper details a technically robust approach for selecting rainfall events for stormwater treatment design using a three-component model. The modelling results confirmed that high-intensity, short-duration events produce 58.0% of the TS load while generating only 29.1% of the total runoff volume. Additionally, rainfall events smaller than the 6-month average recurrence interval (ARI) generate a greater cumulative runoff volume (68.4% of the total annual runoff volume) and TS load (68.6% of the TS load exported) than rainfall events larger than the 6-month ARI. The results suggest that, for the study catchments, stormwater treatment design could be based on rainfall with a mean average intensity of 31 mm/h and a mean duration of 0.4 h. These outcomes also confirmed that selecting smaller-ARI rainfall events with high intensity and short duration as the threshold for treatment system design is the most feasible approach, since these events cumulatively generate a major portion of the annual pollutant load compared to other event types, despite producing a relatively smaller runoff volume. This implies that designs based on small, more frequent rainfall events rather than larger rainfall events would be appropriate in terms of treatment performance, cost-effectiveness and possible savings in the land area needed.
Abstract:
Increasing numbers of medical schools in Australia and overseas have moved away from didactic teaching methodologies and embraced problem-based learning (PBL) to improve clinical reasoning skills and communication skills as well as to encourage self-directed lifelong learning. In January 2005, the first cohort of students entered the new MBBS program at the Griffith University School of Medicine, Gold Coast, to embark upon an exciting, fully integrated curriculum using PBL, combining electronic delivery, communication and evaluation systems incorporating cognitive principles that underpin the PBL process. This chapter examines the educational philosophies and design of the e-learning environment underpinning the processes developed to deliver, monitor and evaluate the curriculum. Key initiatives taken to promote student engagement and innovative and distinctive approaches to student learning at Griffith promoted within the conceptual model for the curriculum are (a) Student engagement, (b) Pastoral care, (c) Staff engagement, (d) Monitoring and (e) Curriculum/Program Review. © 2007 Springer-Verlag Berlin Heidelberg.
Abstract:
An estimate of the groundwater budget at the catchment scale is extremely important for the sustainable management of available water resources. Water resources are generally subjected to over-exploitation for agricultural and domestic purposes in agrarian economies like India. The double water-table fluctuation method is a reliable method for calculating the water budget in semi-arid crystalline rock areas. Extensive measurements of water levels from a dense network were made before and after the monsoon rainfall in a 53 km² watershed in southern India, and the various components of the water balance were then calculated. The water level data then underwent geostatistical analyses to determine the priority and/or redundancy of each measurement point using a cross-validation method. An optimal network evolved from these analyses. The network was then used to re-calculate the water-balance components. It was established that such an optimized network requires far fewer measurement points without considerably changing the conclusions regarding the groundwater budget. This exercise is helpful in reducing the time and expenditure involved in exhaustive piezometric surveys, and also in determining the water budget for large watersheds (greater than 50 km²).
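The cross-validation idea used to flag redundant measurement points can be sketched as below. This is a minimal illustration only: it uses inverse-distance weighting as the interpolator and synthetic well coordinates and water levels, whereas the study used geostatistical analyses; every coordinate and level here is an assumption.

```python
import numpy as np

def idw_predict(x, y, xs, ys, values, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from observed points."""
    d = np.hypot(xs - x, ys - y)
    w = 1.0 / d**power
    return float((w * values).sum() / w.sum())

def leave_one_out_errors(xs, ys, levels):
    """Cross-validation error per well: predict each well from all the others."""
    errors = []
    for i in range(len(xs)):
        mask = np.arange(len(xs)) != i
        pred = idw_predict(xs[i], ys[i], xs[mask], ys[mask], levels[mask])
        errors.append(abs(pred - levels[i]))
    return np.array(errors)

# Synthetic well network (km) and pre-monsoon water levels (m) -- invented data.
rng = np.random.default_rng(0)
xs = rng.uniform(0.0, 7.0, 30)
ys = rng.uniform(0.0, 7.0, 30)
levels = 10.0 + 0.5 * xs + 0.3 * ys + rng.normal(0.0, 0.1, 30)

errors = leave_one_out_errors(xs, ys, levels)
# Wells whose levels are predicted well by their neighbours add little
# information and are candidates for removal from the monitoring network.
redundant = np.argsort(errors)[:5]
```

Dropping the lowest-error wells and re-running the water-balance calculation, as the study did with its optimized network, shows how much the conclusions depend on those points.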
Abstract:
We present a case study of the formal verification of a full-wave rectifier for analog and mixed-signal designs. We used the Checkmate tool from CMU [1], a public-domain formal verification tool for hybrid systems. The restrictions imposed by Checkmate necessitated changes to its implementation in order to model this complex, non-linear system. The full-wave rectifier was implemented using Checkmate custom blocks and Simulink blocks from MATLAB (MathWorks). After making the required changes to the Checkmate implementation, we were able to verify the safety properties of the full-wave rectifier efficiently.
Abstract:
Computational modelling of the mechanisms underlying real-world processes can be of great value in understanding complex biological behaviours, and uptake in general biology and ecology has been rapid. However, it often requires specific data sets that are overly costly in time and resources to collect. The aim of the current study was to test whether a generic behavioural ecology model constructed using published data could give realistic outputs for individual species. An individual-based model was developed using the Pattern-Oriented Modelling (POM) strategy and protocol, based on behavioural rules associated with insect movement choices. Frugivorous Tephritidae (fruit flies) were chosen because of their economic significance in global agriculture and the multiple published data sets available for a range of species. The Queensland fruit fly (Qfly), Bactrocera tryoni, was identified as a suitable individual species for testing. Plant canopies with modified architecture were used to run predictive simulations. A field study was then conducted to validate our model predictions of how plant architecture affects fruit fly behaviour. Characteristics of plant architecture such as shape (e.g., closed-canopy versus vase-shaped) affected fly movement patterns and time spent on host fruit. The number of visits to host fruit also differed between the edge and centre in closed-canopy plants. Compared to plant architecture, host fruit contributed less to flies' movement patterns. The results from this model, combined with our field study and published empirical data, suggest that placing fly traps in the upper canopy at the edge should work best. Such a modelling approach allows rapid testing of ideas about organismal interactions with environmental substrates in silico rather than in vivo, generating new perspectives. Using published data provides a saving in time and resources.
Adjustments for specific questions can be achieved by refinement of parameters based on targeted experiments.
Abstract:
This study examines gendered housework in India, particularly in Bihar. The perspective adopted in the study was derived in part from the data but also from sociological literature published both in Western countries and in India. Primary attention is therefore paid to modern and traditional aspects of housework. The aim is not to compare Indian practices to those of Western societies, but rather to use Western studies as a fruitful reference point. In that light, Indian housework practices appear to be traditional. Consequently, traditions are given a more significant role than is usually the case in studies of gendered housework, particularly in Western countries. The study approaches the topic mainly from a socio-cultural perspective; this provides the best means to understand the persistence of traditional habits in India. To get a wide enough picture of the division of labour, three methods were applied in the study: detailed time-use data, a questionnaire and theme interviews. The data were collected in 1988 in two districts of Bihar, one rural and the other urban. The different data complement each other well but also bring to light contradictory findings: on a general level Biharian people expressed surprisingly modern views on gender equality, but when talking in more detail (in the theme interviews) the interviewees described how traditional housework practices still were in 1988. In the analysis of the data set, four principal themes are discussed. Responsibility is the concept by which the study aims to understand the logic of the argumentation on which the persistence of traditional housework practices is grounded. Contrary to the Western style, Biharian respondents appealed not to the principle of choice but to their responsibility to do what has to be done. The power of tradition, the early socialization of children into the traditional division of labour and the elusive nature of modernity are all discussed separately.
In addition to the principle of responsibility, housework was also seen as an expression of affection. This was connected to housework in general but also to traditional practices. The purity principle was the third element that made Biharian interviewees favour housework in general, but as with affection it too was interwoven with traditional practices. It seems that if housework in general is preferred, this leads to a preference for the traditional division of labour, too. The same emerged when examining economic imperatives. However, the arguments concerning them proved to be rational. In analysing them it became clear that the significance of traditions also depends greatly on economics: as long as the average income in India remains very low, the prevalence of traditional practices in housework will continue. However, to make this work, cultural arguments are required: their role is to mediate more smoothly the iron rules of the economy. Key words: family, gendered housework, division of labour, responsibility, family togetherness, emotion, economy of housework, modernity, traditionality
Abstract:
This article explores the influence of cultural and religious beliefs and laws on how individuals make decisions about asset distribution through wills, drawing on a case study of Islamic will makers. The findings highlight diversity in beliefs and practices within Australian Islamic communities. When drafting a will, people from culturally diverse backgrounds need to accommodate their religious and cultural values as well as local law. Implications of the research findings for legal policy and practice in Australia are discussed.
Abstract:
The Clean Development Mechanism (CDM), Article 12 of the Kyoto Protocol allows Afforestation and Reforestation (A/R) projects as mitigation activities to offset the CO2 in the atmosphere whilst simultaneously seeking to ensure sustainable development for the host country. The Kyoto Protocol was ratified by the Government of India in August 2002 and one of India's objectives in acceding to the Protocol was to fulfil the prerequisites for implementation of projects under the CDM in accordance with national sustainable priorities. The objective of this paper is to assess the effectiveness of using large-scale forestry projects under the CDM in achieving its twin goals using Karnataka State as a case study. The Generalized Comprehensive Mitigation Assessment Process (GCOMAP) Model is used to observe the effect of varying carbon prices on the land available for A/R projects. The model is coupled with outputs from the Lund-Potsdam-Jena (LPJ) Dynamic Global Vegetation Model to incorporate the impacts of temperature rise due to climate change under the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) A2, A1B and B1. With rising temperatures and CO2, vegetation productivity is increased under A2 and A1B scenarios and reduced under B1. Results indicate that higher carbon price paths produce higher gains in carbon credits and accelerate the rate at which available land hits maximum capacity thus acting as either an incentive or disincentive for landowners to commit their lands to forestry mitigation projects. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
Governance has been one of the most popular buzzwords in recent political science. As with any term shared by numerous fields of research, as well as everyday language, governance is encumbered by a jungle of definitions and applications. This work elaborates on the concept of network governance. Network governance refers to complex policy-making situations in which a variety of public and private actors collaborate to produce and define policy. Governance consists of processes of autonomous, self-organizing networks of organizations exchanging information and deliberating. Network governance is a theoretical concept that corresponds to an empirical phenomenon, and it is often used to describe a historical development: changes in the political processes of Western societies since the 1980s. In this work, empirical governance networks are used as an organizing framework, and the concepts of autonomy, self-organization and network structure are developed as tools for the empirical analysis of any complex decision-making process. The work develops this framework and explores governance networks in the case of environmental policy-making in the City of Helsinki, Finland. The crafting of a local ecological sustainability programme required support and knowledge from all sectors of administration, a number of entrepreneurs and companies, and the inhabitants of Helsinki. The policy process relied explicitly on networking, with public and private actors collaborating to design policy instruments. Communication between individual organizations led to the development of network structures and patterns. This research analyses these patterns and their effects on policy choice by applying the methods of social network analysis. A variety of social network analysis methods are used to uncover different features of the networked process.
Links between individual network positions, network subgroup structures and macro-level network patterns are compared to the types of organizations involved and final policy instruments chosen. By using governance concepts to depict a policy process, the work aims to assess whether they contribute to models of policy-making. The conclusion is that the governance literature sheds light on events that would otherwise go unnoticed, or whose conceptualization would remain atheoretical. The framework of network governance should be in the toolkit of the policy analyst.
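The kind of network measure used in such an analysis can be illustrated with a minimal sketch: degree centrality computed over a toy collaboration network. The organization names and ties below are invented for illustration and are not taken from the Helsinki case.

```python
# Toy collaboration network: pairs of organizations that exchanged
# information during the policy process (invented for illustration).
ties = [
    ("environment_office", "planning_dept"),
    ("environment_office", "energy_company"),
    ("environment_office", "residents_assoc"),
    ("planning_dept", "energy_company"),
    ("residents_assoc", "local_ngo"),
]

# Build an undirected adjacency list from the tie list.
network = {}
for a, b in ties:
    network.setdefault(a, set()).add(b)
    network.setdefault(b, set()).add(a)

# Degree centrality: the share of all other actors each actor is tied to.
n = len(network)
centrality = {actor: len(neigh) / (n - 1) for actor, neigh in network.items()}

# The most central actor is best placed to broker information flows.
most_central = max(centrality, key=centrality.get)
```

Subgroup structures and macro-level patterns of the kind compared in the study are computed over the same adjacency representation, typically with a dedicated social network analysis library.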
Abstract:
This article reports on a cross-sectional case study of a large construction project in which electronic document management (EDM) was used. Attitudes towards EDM from the perspective of individual end users were investigated. Responses from a survey were combined with data from system usage log files to obtain an overview of the attitudes prevalent in different user segments of the total population of 334 users. The survey was followed by semi-structured interviews with representative users. A strong majority of users from all segments of the project group considered EDM a valuable aid in their work processes, despite certain functional limitations of the system used and the complexity of the information mass. Based on the study, a model describing the key factors affecting end-user EDM adoption is proposed. The model draws on insights from earlier studies of EDM-enabled projects and theoretical frameworks on technology acceptance and the success of information systems, as well as the insights gained from the case study.
Abstract:
Triggered by the very rapid proliferation of Internet connectivity, electronic document management (EDM) systems are now being widely adopted for managing the documentation produced and exchanged in construction projects. Nevertheless, there are still substantial barriers to the efficient use of such systems, mainly of a psychological nature and related to insufficient training. This paper presents the results of empirical studies carried out during 2002 concerning the usage of EDM systems in the Finnish construction industry. The studies employed three different methods in order to provide a multifaceted view of the problem area, at both the industry and individual project level. To provide an accurate measurement of overall usage volume in the industry as a whole, telephone interviews were conducted with key personnel from 100 randomly chosen construction projects. The interviews showed that while around one third of big projects have already adopted EDM, very few small projects have adopted this technology. The barriers to introduction were investigated through interviews with representatives of half a dozen providers of systems and ASP services. These interviews shed considerable light on the dynamics of the market for this type of service and illustrated the diversity of business strategies adopted by vendors. In the final study, log files from a project which had used an EDM system were analysed in order to determine usage patterns. The results showed that use is still incomplete in coverage and that only some of the individuals involved in the project used the system efficiently, either as information producers or consumers. The study also provided feedback on the usefulness of the log files.
Abstract:
Open access is a new model for publishing scientific journals, enabled by the Internet, in which the published articles are freely available for anyone to read. During the 1990s, hundreds of individual open access journals were founded by groups of academics, supported by grants and unpaid voluntary work. During the last five years, other types of open access journals, funded by author charges, have started to emerge, and established publishers have also started to experiment with different variations of open access. This article reports on the experiences of one open access journal (The Electronic Journal of Information Technology in Construction, ITcon) over its ten-year history. In addition to a straightforward account of the lessons learned, the journal is benchmarked against a number of competitors in the same research area, and its development is put into the larger perspective of changes in scholarly publishing. The main finding is that a journal publishing around 20-30 articles per year, equivalent to a typical quarterly journal, can be sustainably produced using an open-source-like production model. The journal outperforms its competitors in some respects, such as speed of publication, availability of results and a balanced global distribution of authorship, and is on a par with them in most other respects. The key statistics for ITcon are: acceptance rate 55%; average speed of publication 6-7 months; 801 subscribers to email alerts; an average of 21 downloads by human readers per paper per month.
Abstract:
The World Wide Web provides the opportunity for a radically changed and much more efficient communication process for scientific results. A survey in the closely related domains of construction information technology and construction management was conducted in February 2000, aimed at measuring the extent to which these opportunities are already changing scientific information exchange and how researchers feel about the changes. The paper presents results based on 236 replies to an extensive Web-based questionnaire. 65% of the respondents stated their primary research interest as IT in A/E/C, and 20% as construction management and economics. The questions dealt with how researchers find, access and read different sources; how much and which publications they read; how often they travel, and to which conferences; how much they publish; and what the criteria are for where they eventually decide to publish. Some of the questions compared traditional and electronic publishing, with one final section dedicated to opinions about electronic publishing. According to the survey, researchers already download from the Web half of the material that they read. The most popular method for retrieving an interesting publication is downloading it for free from the author's or publisher's website. Researchers are not particularly willing to pay for electronic scientific publications. There is much support for a scenario of electronic journals available totally free on the Web, where the costs could be covered by, for instance, professional societies or the publishing university. The shift that the Web is causing seems to be towards the "just in time" reading of literature. Also, frequent users of the Web rely less on scientific publications and tend to read fewer articles. If available with little effort, papers published in traditional journals are preferred; if not, the papers should be on the Web.
In these circumstances, the role of paper-based journals published by established publishers is shifting from the core "information exchange" to the building of authors' prestige. The respondents feel they should build up their reputations by publishing in journals and relevant conferences, but then make their work freely available on the Web.
Abstract:
In many parts of the world, the goal of electricity supply industries is the introduction of competition and a lowering of the average consumer price. Because of this, it has become much more important to be able to determine which generators are supplying a particular load, how much use each generator is making of a transmission line, and what each generator's contribution to the system losses is. In this paper, a case study of generator contributions towards loads and transmission flows is illustrated with an equivalent 11-bus system, part of the Indian Southern Grid, based on the concept of circuit flow directions, under normal and network contingency conditions.