141 results for use value


Relevance: 30.00%

Publisher:

Abstract:

Over the past 20 years the nature of rural valuation practice has required most rural valuers to undertake studies in both agriculture (farm management) and valuation, especially if carrying out valuation work for financial institutions. The additional farm financial and management information obtained by rural valuers exceeds the level of information required to value commercial, retail and industrial property by the capitalisation of net rent/profit method, and is very similar to the level of information required for the valuation of commercial and retail property by the discounted cash flow (DCF) method. On this basis, valuers specialising in rural valuation practice have the necessary skills and information to value rural properties by an income valuation method that can focus on the long-term environmental and economic sustainability of the property being valued. This paper reviews the results of an extensive survey carried out by rural property valuers in Australia on the impact of farm management on rural property values and sustainable rural land use. A particular focus of the research is the increased awareness of rural land degradation in Australia and the subsequent impact such problems have on the productivity of rural land. These problems of sustainable land use have created the need for an approach to rural valuation practice that allows the valuer to factor the past management practices on the subject rural property into the actual valuation figure. An analysis of past farm management, and the inclusion of this data in the valuation methodology, provides a much more reliable indication of a farm's sustainable economic value than the existing direct comparison methodology.
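As a point of reference for the income methods mentioned above, the sketch below contrasts direct capitalisation with a discounted cash flow calculation in Python. The figures, variable names and the declining-income adjustment are hypothetical illustrations, not data or methodology from the paper.

```python
# Minimal sketch of the two income-based valuation approaches discussed above.
# All figures and the sustainability adjustment are hypothetical.

def capitalised_value(net_income, cap_rate):
    """Direct capitalisation: value = stabilised net income / capitalisation rate."""
    return net_income / cap_rate

def dcf_value(cash_flows, terminal_value, discount_rate):
    """Discounted cash flow: present value of forecast net cash flows plus a terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))
    pv += terminal_value / (1 + discount_rate) ** len(cash_flows)
    return pv

if __name__ == "__main__":
    # Hypothetical farm: $180k stabilised net income, 6% capitalisation rate.
    print(round(capitalised_value(180_000, 0.06)))
    # Ten-year DCF with income trending down where past management has
    # degraded the land (the kind of adjustment the paper argues for).
    flows = [180_000 * (1 - 0.01 * t) for t in range(10)]
    print(round(dcf_value(flows, 2_800_000, 0.07)))
```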

Relevance: 30.00%

Publisher:

Abstract:

The high level of scholarly writing required for a doctoral thesis is a challenge for many research students. However, formal academic writing training is not a core component of many doctoral programs. Informal writing groups for doctoral students may be one method of contributing to the improvement of scholarly writing. In this paper, we report on a writing group that was initiated by an experienced writer and higher degree research supervisor to support and improve her doctoral students' writing capabilities. Over time, this group developed a workable model to suit their varying needs and circumstances. The model comprised group sessions, an email group, and individual writing. Here, we use a narrative approach to explore the effectiveness and value of our research writing group model in improving scholarly writing. The data consisted of doctoral students' reflections on stimulus questions about their writing progress and experiences. The stimulus questions sought to probe individual concerns about their own writing, what they had learned in the research writing group, the benefits of the group, and the disadvantages of and challenges to participation. These reflections were analysed using thematic analysis. Following this analysis, the supervisor provided her perspective on the key themes that emerged. Results revealed that, through the writing group, members learned technical elements (e.g., paragraph structure), non-technical elements (e.g., working within limited timeframes), conceptual elements (e.g., constructing a cohesive argument), collaborative writing processes, and how to edit and respond to feedback. In addition to improved writing quality, other benefits were opportunities for shared writing experiences, peer support, and increased confidence and motivation. The writing group provides a unique social learning environment with opportunities for professional dialogue about writing, peer learning and review, and developing a supportive peer network. Thus, our research writing group has proved an effective avenue for building doctoral students' capability in scholarly writing. The proposed model for a research writing group could be applicable to any context, regardless of the type and location of the university, the university faculty, the doctoral program structure, or the number of postgraduate students. It could also be used within a group of students with diverse research abilities, needs, topics and methodologies. However, it requires a group facilitator with sufficient expertise in scholarly writing and experience in doctoral supervision who can both engage the group in planned writing activities and capitalise on fruitful lines of discussion related to students' concerns as they arise. The research writing group is not intended to replace traditional supervision processes or existing training; however, it has clear benefits for improving scholarly writing in doctoral research programs, particularly in an era of rapidly increasing student load.

Relevance: 30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to examine the use of bid information, including both price and non-price factors, in predicting the bidder's performance. Design/methodology/approach – The practice of the industry was first reviewed. Data on bid evaluation and performance records of the successful bids were then obtained from the Hong Kong Housing Department, the largest housing provider in Hong Kong. This was followed by the development of a radial basis function (RBF) neural network based performance prediction model. Findings – It is found that public clients are more conscientious and include non-price factors in their bid evaluation equations. The input variables used are those available at the time of the bid, and the output variable is the project performance score achieved by the successful bidder, recorded while work is in progress. Past project performance score was found to be the most sensitive input variable in predicting future performance. Research limitations/implications – The paper shows the inadequacy of using price alone as the bid award criterion. The need for systematic performance evaluation is also highlighted, as this information is highly instrumental for subsequent bid evaluations. The caveat for this study is that the prediction model was developed from data obtained from a single source. Originality/value – The value of the paper is in the use of an RBF neural network as the prediction tool, because it can model non-linear functions. This capability avoids tedious "trial and error" in deciding the number of hidden layers to be used in the network model. Keywords: Hong Kong, Construction industry, Neural nets, Modelling, Bid offer spreads. Paper type: Research paper.
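As an illustration of the modelling approach described (not the authors' implementation), the following sketch builds a small radial basis function network: Gaussian hidden units centred by k-means and a linear output layer fitted by least squares. The input features and synthetic data are assumptions for demonstration only.

```python
# Minimal RBF-network sketch: Gaussian hidden units centred by k-means,
# linear output weights fitted by least squares. Feature names and data
# are hypothetical, not taken from the Hong Kong Housing Department dataset.
import numpy as np
from sklearn.cluster import KMeans

def rbf_design_matrix(X, centres, width):
    # Gaussian activation of each sample against each centre.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * width ** 2))

rng = np.random.default_rng(0)
# Columns: normalised bid price ratio, past performance score, tender interval, workload.
X = rng.random((200, 4))
# Synthetic target: past performance dominates, echoing the paper's sensitivity finding.
y = 60 + 30 * X[:, 1] + 5 * X[:, 0] + rng.normal(0, 2, 200)

centres = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
width = np.mean(np.linalg.norm(X[:, None] - centres[None], axis=2))
Phi = rbf_design_matrix(X, centres, width)
w, *_ = np.linalg.lstsq(np.c_[Phi, np.ones(len(X))], y, rcond=None)  # weights + bias

new_bid = rng.random((1, 4))
pred = np.c_[rbf_design_matrix(new_bid, centres, width), [1.0]] @ w
print(f"predicted performance score: {pred[0]:.1f}")
```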

Relevance: 30.00%

Publisher:

Abstract:

Purpose – The introduction of Building Information Modelling (BIM) tools over the last 20 years is resulting in radical changes in the Architectural, Engineering and Construction industry. One of these changes concerns the use of virtual prototyping, an advanced technology integrating BIM with realistic graphical simulations. Construction Virtual Prototyping (CVP) has now been developed and implemented on ten real construction projects in Hong Kong over the past three years. This paper reports on a survey aimed at establishing the effects of adopting this new technology and obtaining recommendations for future development. Design/methodology/approach – A questionnaire survey was conducted in 2007 of 28 key participants involved in four major Hong Kong construction projects, these projects being chosen because the CVP approach was used in more than one stage of each project. In addition, several interviews were conducted with the project manager, planning manager and project engineer of an individual project. Findings – All the respondents and interviewees gave a positive response to the CVP approach, with the most useful software functions considered to be those relating to visualisation and communication. The CVP approach was thought to improve the collaboration efficiency of the main contractor and sub-contractors by approximately 30 percent, with a concomitant 30 to 50 percent reduction in meeting time. The most important benefits of CVP in the construction planning stage are the improved accuracy of process planning and shorter planning times, while improved fieldwork instruction and reduced rework occur in the construction implementation stage. Although project teams are hesitant to attribute specific time savings directly to the use of CVP, it was acknowledged that the workload of project planners is decreased. Suggestions for further development of the approach include the incorporation of automatic scheduling and advanced assembly study. Originality/value – Whilst the research, development and implementation of CVP is relatively new in the construction industry, it is clear from the applications and feedback to date that the approach provides considerable added value to the organisation and management of construction projects.

Relevance: 30.00%

Publisher:

Abstract:

Objective: To summarise the extent to which narrative text fields in administrative health data are used to gather information about the event resulting in presentation to a health care provider for treatment of an injury, and to highlight best-practice approaches to interrogating narrative text for injury surveillance purposes.
Design: Systematic review.
Data sources: Electronic databases searched included CINAHL, Google Scholar, Medline, Proquest, PubMed and PubMed Central. Snowballing strategies were employed by searching the bibliographies of retrieved references to identify relevant associated articles.
Selection criteria: Papers were selected if the study used a health-related database and if the study objectives were to (a) use text fields to identify injury cases or to extract additional information on injury circumstances not available from coded data, (b) use text fields to assess the accuracy of coded data fields for injury-related cases, or (c) describe methods or approaches for extracting injury information from text fields.
Methods: The papers identified through the search were independently screened by two authors for inclusion, resulting in 41 papers selected for review. Due to heterogeneity between studies, meta-analysis was not performed.
Results: The majority of papers reviewed (28 papers) focused on describing injury epidemiology trends using coded data with text fields as a supplement; these studies demonstrated the value of text data for providing more specific information beyond what had been coded, to enable case selection or provide circumstantial information. Caveats were expressed in terms of the consistency and completeness of recording of text information, resulting in underestimates when using these data. Four coding validation papers were reviewed, showing the utility of text data for validating and checking the accuracy of coded data. Seven studies (9 papers) described methods for interrogating injury text fields for systematic extraction of information, using a combination of manual and semi-automated methods to refine and develop algorithms for extracting and classifying coded data from text. Quality assurance approaches to assessing the robustness of the text extraction methods were discussed in only 8 of the epidemiology papers and 1 of the coding validation papers. All of the text interrogation methodology papers described systematic approaches to ensuring the quality of the approach.
Conclusions: Manual review and coding approaches, text search methods, and statistical tools have been used to extract data from narrative text and translate it into useable, detailed injury event information. These techniques can be, and have been, applied to administrative datasets to identify specific injury types and add value to previously coded injury datasets. Only a few studies thoroughly described the text mining methods used, and fewer than half of the reviewed studies described quality assurance methods for ensuring the robustness of the approach. New techniques using semi-automated computerised approaches and Bayesian/clustering statistical methods offer the potential to further develop and standardise the analysis of narrative text for injury surveillance.
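To make the idea of narrative text interrogation concrete, the following sketch applies simple keyword/regex rules to free-text injury narratives, in the spirit of the manual and semi-automated methods the review describes. The record layout, codes and narratives are invented examples, not data from any of the reviewed studies.

```python
# Hypothetical sketch of rule-based interrogation of injury narrative text,
# illustrating how free text can supplement or cross-check coded fields.
import re

FALL_PATTERN = re.compile(r"\b(fell|fall|slipped|tripped)\b", re.IGNORECASE)
LADDER_PATTERN = re.compile(r"\bladder\b", re.IGNORECASE)

records = [  # (coded external cause, free-text narrative) - invented examples
    ("W11", "PT FELL FROM LADDER WHILE CLEANING GUTTERS"),
    ("W01", "tripped on loose carpet at home, struck head"),
    ("V43", "driver in low speed car park collision"),
]

for code, narrative in records:
    is_fall = bool(FALL_PATTERN.search(narrative))
    ladder_involved = bool(LADDER_PATTERN.search(narrative))
    # The text adds circumstance detail (e.g. ladder involvement) beyond the code,
    # and can flag cases where the code and the narrative disagree.
    print(code, is_fall, ladder_involved)
```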

Relevance: 30.00%

Publisher:

Abstract:

Broadly speaking, axiology is the study of values. Axiologies are expressed materially in patterns of choices that are both culture-bound and definitive of different cultures. They are expressed in the language we use; in the friends we keep; in the clothes we wear; in what we read, write, and watch; in the technologies we use; in the gods we believe in and pray to; in the music we make and listen to—indeed, in every kind of activity that can be counted as a definitive element of culture. In what follows, I describe the axiological underpinnings of two closely related multimedia repository projects— Australian Creative Resources Online (ACRO) and The Canadian Centre for Cultural Innovation (CCCI)—and how these are oriented towards a potentially liberating role for digital repositories.

Relevance: 30.00%

Publisher:

Abstract:

This paper seeks to identify the sources of value in a government health screening service. Consumers' use of such services for their own benefit demonstrates desirable behaviour, and their continued use of these services indicates maintenance of that behaviour. There are also positive outcomes for society, as the health of its members is improved overall through this behaviour. Individual in-depth interviews with 25 women who use breast cancer screening services provided by BreastScreen (BSQ) revealed five categories of sources of value: information sources, interaction sources, service, environment, and consumer participation. These findings provide valuable insights into the value construction of consumers and contribute towards our understanding of the value concept in social marketing.

Relevance: 30.00%

Publisher:

Abstract:

The value of soil evidence in the forensic discipline is well known. However, it would be advantageous if an in-situ method were available that could record responses from tyre or shoe impressions in ground soil at the crime scene. The development of optical fibres and emerging portable NIR instruments has unveiled a potential methodology which could permit such an approach. The NIR spectral region contains rich chemical information in the form of overtone and combination bands of the fundamental infrared absorptions and low-energy electronic transitions. This region has in the past been perceived as too complex for interpretation and consequently was scarcely utilised. The application of NIR in the forensic discipline is virtually non-existent, creating a vacancy for research in this area. NIR spectroscopy has great potential in the forensic discipline as it is simple, non-destructive and capable of rapidly providing information relating to chemical composition. The objective of this study is to investigate the ability of NIR spectroscopy combined with chemometrics to discriminate between individual soils. A further objective is to apply the NIR process to a simulated forensic scenario where soil transfer occurs. NIR spectra were recorded from twenty-seven soils sampled from the Logan region in South-East Queensland, Australia. A series of three high-quartz soils were mixed with three different kaolinites in varying ratios and NIR spectra collected. Spectra were also collected from six soils as their temperature was ramped from room temperature up to 600 °C. Finally, a forensic scenario was simulated in which the transfer of ground soil to shoe soles was investigated. Chemometrics methods such as the commonly known principal component analysis (PCA), the less well known fuzzy clustering (FC) and ranking by means of multicriteria decision making (MCDM) methodology were employed to interpret the spectral results. All soils were characterised using inductively coupled plasma optical emission spectroscopy (ICP-OES) and X-ray diffractometry (XRD). The results were promising, revealing that NIR combined with chemometrics is capable of discriminating between the various soils. Peak assignments were established by comparing the spectra of known minerals with the spectra collected from the soil samples. The temperature-dependent NIR analysis confirmed the assignments of the absorptions due to adsorbed and molecular bound water. The relative intensities of the identified NIR absorptions reflected the quantitative XRD and ICP characterisation results. PCA and FC analysis of the raw soils in the initial NIR investigation revealed that the soils were primarily distinguished on the basis of their relative quartz and kaolinite contents, and to a lesser extent on the horizon from which they originated. Furthermore, PCA could distinguish between the three kaolinites used in the study, suggesting that the NIR spectral region is sensitive enough to contain information describing variation within kaolinite itself. In the forensic scenario simulation, PCA successfully discriminated between the ‘Backyard Soil’ and ‘Melcann® Sand’, as well as between the two sampling methods employed. Further PCA exploration revealed that it was possible to distinguish between the various shoes used in the simulation. In addition, it was possible to establish an association between specific sampling sites on the shoe and the corresponding sites remaining in the impression.
The forensic application revealed some limitations of the process relating to moisture content and homogeneity of the soil. These limitations can both be overcome by simple sampling practices and by maintaining the original integrity of the soil. The results from the forensic scenario simulation show that the concept holds great promise in the forensic discipline.
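As a rough illustration of the chemometric step (not the instrumentation, preprocessing or data used in the study), the sketch below runs principal component analysis on simulated NIR-like spectra of soils with different kaolinite contents using scikit-learn.

```python
# Minimal PCA sketch for discriminating soil NIR spectra.
# The simulated spectra are synthetic stand-ins for real measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
wavelengths = np.linspace(1000, 2500, 300)           # nm, a typical NIR range

def soil_spectrum(kaolinite_fraction):
    # Crude two-component mixture: a quartz-like baseline plus a
    # kaolinite-like band near 2200 nm, with measurement noise.
    band = np.exp(-((wavelengths - 2200) / 30) ** 2)
    baseline = 0.3 + 0.0001 * (wavelengths - 1000)
    return baseline + kaolinite_fraction * band + rng.normal(0, 0.005, wavelengths.size)

fractions = np.repeat([0.1, 0.4, 0.8], 10)           # three "soils", 10 replicates each
spectra = np.vstack([soil_spectrum(f) for f in fractions])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
for f in (0.1, 0.4, 0.8):
    print(f"kaolinite {f:.1f}: mean PC1 score {scores[fractions == f, 0].mean():+.2f}")
```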

Relevance: 30.00%

Publisher:

Abstract:

In recent years the air transport industry has experienced unprecedented growth, driven by strong local and global economies. Whether this growth can continue in the face of anticipated oil crises, international economic forecasts and recent influenza outbreaks is yet to be seen. One thing is certain: airport owners and operators will continue to be faced with challenging environments in which to do business. In response, many airports recognise the value in diversifying their revenue streams through a variety of landside property developments within the airport boundary. In Australia, it is the type and intended market of this development that is a point of contention between private airport corporations and their surrounding municipalities. The aim of this preliminary research is to identify and categorise on-airport development occurring at the twenty-two privatised Australian airports administered under the Airports Act 1996. This new knowledge will assist airport and municipal planners in understanding the current extent and category of on-airport land use, allowing them to make better decisions when proposing development both within airport master plans and beyond the airport boundary in local town and municipal plans.

Relevance: 30.00%

Publisher:

Abstract:

Most online assessment systems now incorporate social networking features, and recent developments in social media spaces include protocols that allow the synchronisation and aggregation of data across multiple user profiles. In light of these advances, and the concomitant fear of data sharing in secondary school education, this paper provides important research findings about generic features of online social networking which educators can use to make sound and efficient assessments in collaboration with their students and colleagues. The paper reports on a design experiment in flexible educational settings that challenges the dichotomous legacy of success and failure evident in many assessment activities for at-risk youth. Combining social networking practices with the sociology of education, the paper proposes that assessment activities are best understood as a negotiable field of exchange. In this design experiment, students, peers and educators engage in explicit, "front-end" assessment (Wyatt-Smith, 2008) to translate digital artefacts into institutional, and potentially economic, capital without continually referring to paper-based pre-set criteria. This approach invites students and educators to use social networking functions to assess “work in progress” and final submissions in collaboration; in doing so, assessors refine their evaluative expertise and negotiate the value of students' work, from which new criteria can emerge. The mobile advantages of web-based technologies aggregate, externalise and democratise this transparent assessment model for most, if not all, student work that can be digitally represented.

Relevance: 30.00%

Publisher:

Abstract:

This report documents Stage Two of the Australian ePortfolio Project (AeP2), which specifically explored the current scope of national and international ePortfolio communities of practice in order to identify the factors that have contributed to their success and sustainability. The study built on Stage One of the Australian ePortfolio Project (Hallam, Harper, McCowan, Hauville, McAllister, & Creagh, 2008), which outlined the broad range of issues and challenges, as well as significant opportunities, facing the higher education sector in terms of ePortfolio practice, to determine how the emergent community of ePortfolio researchers and practitioners in Australia might be advanced.

The overarching aims of this project were to focus on building the Australian community of practice through an online forum and further symposium activities. Through the research activities the project sought to generate the following major outcomes: develop a forum within the ALTC Exchange to support an ePortfolio community of practice; develop strategies to encourage interest in and engagement with community of practice activities; develop and promote resources to support the diverse stakeholders in ePortfolio practice; collaborate in the establishment of a cross-sector ePortfolio community of practice; host a second Australian ePortfolio Symposium (AeP2) to disseminate the findings from the Australian ePortfolio Project, to explore innovative practice in ePortfolio use in higher education, to articulate policy developments, and to stimulate discussion on international ePortfolio issues; host an associated trade display as a forum for strengthening the higher education sector's understanding of the features and functionality of ePortfolio platforms; and develop resources to support an ePortfolio symposium model that may be adopted for future events.

The project activities encompassed a survey of stakeholders, a program of semi-structured interviews with community managers, and a series of case studies depicting successful ePortfolio communities. The survey of ePortfolio practitioners sought to determine the potential value of an ePortfolio CoP, the preferred focus for and the desired features of such a community, as well as the options for the technical and social architecture of an online forum. Through the semi-structured interviews it was possible to examine current examples of CoP activity and to identify the critical success factors and the challenges faced by individual ePortfolio CoPs, so that the attributes of good practice could be presented. The data collected in the interviews contributed to the development of 14 case studies, which have been beneficial in illustrating the diverse nature of CoPs in Australia and overseas.

The report presents a rich picture of national and international ePortfolio communities of practice, with an examination of the factors that have contributed to their success and sustainability.

Relevance: 30.00%

Publisher:

Abstract:

Since its launch in 2001, the Creative Commons open content licensing initiative has received both praise and censure. While some have touted it as a major step towards removing the burdens copyright law imposes on creativity and innovation in the digital age, others have argued that it robs artists of their rightful income. This paper aims to provide a brief overview and analysis of the practical application of the Creative Commons licences five years after their launch. It looks at how the Creative Commons licences are being used and who is using them, and attempts to identify likely motivations for doing so. By identifying trends in how this licence use has changed over time, it also attempts to rebut arguments that Creative Commons is a movement of academics and hobbyists, and has no value for traditional organisations or working artists.

Relevance: 30.00%

Publisher:

Abstract:

In this thesis we are interested in financial risk, and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data to choose the best model. The existing literature is very wide, perhaps controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach for modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently; in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the data-generating process (DGP) underlying the returns and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness, implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on the three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
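For comparison, the benchmark mentioned at the end of the abstract, historical simulation, amounts to reading VaR off a rolling empirical percentile. The sketch below illustrates that baseline on synthetic heavy-tailed returns; fitting a generalised lambda distribution within each window, as the thesis does, is not implemented here.

```python
# Moving-window, percentile-based VaR (historical simulation) on synthetic returns.
# The thesis's approach instead fits a generalised lambda distribution in each
# window and reads the percentile off the fitted distribution; that step is omitted.
import numpy as np

rng = np.random.default_rng(42)
returns = rng.standard_t(df=4, size=2000) * 0.01       # heavy-tailed daily returns (synthetic)

window, alpha = 250, 0.01                               # ~1 trading year, 99% confidence
var_series = np.array([
    -np.quantile(returns[t - window:t], alpha)          # loss quantile of the trailing window
    for t in range(window, len(returns))
])

breaches = (returns[window:] < -var_series).mean()
print(f"mean 99% VaR: {var_series.mean():.4f}, breach rate: {breaches:.3%} (expect ~1%)")
```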

Relevance: 30.00%

Publisher:

Abstract:

The high morbidity and mortality associated with atherosclerotic coronary vascular disease (CVD) and its complications are being lessened by increased knowledge of risk factors, effective preventative measures and proven therapeutic interventions. However, significant CVD morbidity remains, and sudden cardiac death continues to be a presenting feature for some people subsequently diagnosed with CVD. Coronary vascular disease is also the leading cause of anaesthesia-related complications. Stress electrocardiography/exercise testing is predictive of the 10-year risk of CVD events, and the cardiovascular variables used to score this test are monitored peri-operatively. Similar physiological time-series datasets are being subjected to data mining methods for the prediction of medical diagnoses and outcomes. This study aims to find predictors of CVD using anaesthesia time-series data and patient risk factor data, applying several pre-processing and predictive data mining methods to these data. Physiological time-series data related to anaesthetic procedures are subjected to pre-processing methods for removal of outliers and calculation of moving averages, as well as to data summarisation and data abstraction methods. Feature selection methods of both wrapper and filter types are applied to the derived physiological time-series variable sets alone and to the same variables combined with risk factor variables. The ability of these methods to identify subsets of highly correlated but non-redundant variables is assessed. The major dataset is derived from the entire anaesthesia population; subsets of this population are considered to be at increased anaesthesia risk based on their need for more intensive monitoring (invasive haemodynamic monitoring and additional ECG leads). Because of the unbalanced class distribution in the data, majority-class under-sampling and the Kappa statistic, together with misclassification rate (MR) and area under the ROC curve (AUC), are used for evaluation of models generated using different prediction algorithms. The performance of models derived from feature-reduced datasets reveals the filter method, Cfs subset evaluation, to be the most consistently effective, although Consistency-derived subsets tended to give slightly increased accuracy but markedly increased complexity. The use of MR for model performance evaluation is influenced by class distribution; this could be addressed by consideration of the AUC or Kappa statistic, as well as by evaluation of subsets with an under-sampled majority class. The noise and outlier removal pre-processing methods produced models with MR ranging from 10.69 to 12.62, with the lowest value being for data from which both outliers and noise were removed (MR 10.69). For the raw time-series dataset, MR is 12.34. Feature selection reduces MR to between 9.8 and 10.16, with the time-segmented summary data (dataset F) having MR 9.8 and the raw time-series summary data (dataset A) MR 9.92. However, for all datasets based on time-series data only, the complexity is high. For most pre-processing methods, Cfs could identify a subset of correlated and non-redundant variables from the time-series-only datasets, but models derived from these subsets are of one leaf only. MR values are consistent with the class distribution in the subset folds evaluated in the n-fold cross-validation method.
For models based on Cfs-selected time-series-derived and risk factor (RF) variables, the MR ranges from 8.83 to 10.36, with dataset RF_A (raw time-series data and RF) having MR 8.85 and dataset RF_F (time-segmented time-series variables and RF) having MR 9.09. The models based on counts of outliers and counts of data points outside the normal range (dataset RF_E), and on variables derived from time series transformed using Symbolic Aggregate Approximation (SAX) with associated time-series pattern cluster membership (dataset RF_G), perform least well, with MR of 10.25 and 10.36 respectively. For coronary vascular disease prediction, nearest neighbour (NNge) and the support vector machine based method, SMO, have the highest MR, at 10.1 and 10.28, while logistic regression (LR) and the decision tree (DT) method, J48, have MR of 8.85 and 9.0 respectively. The DT rules are the most comprehensible and clinically relevant. The increase in predictive accuracy achieved by adding risk factor variables to models based on time-series variables is significant, while the addition of time-series-derived variables to models based on risk factor variables alone is associated with a trend towards improved performance. Data mining of feature-reduced anaesthesia time-series variables together with risk factor variables can produce compact and moderately accurate models able to predict coronary vascular disease. Decision tree analysis of time-series data combined with risk factor variables yields rules which are more accurate than models based on time-series data alone. The limited additional value provided by electrocardiographic variables when compared with the use of risk factors alone is similar to recent suggestions that exercise electrocardiography (exECG) under standardised conditions has limited additional diagnostic value over risk factor analysis and symptom pattern. The pre-processing used in this study had limited effect when time-series variables and risk factor variables are used together as model input. In the absence of risk factor input, the use of time-series variables after outlier removal, and of time-series variables based on physiological values falling outside the accepted normal range, is associated with some improvement in model performance.
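The overall workflow (feature selection on an under-sampled training set, a decision-tree classifier, evaluation by AUC and Kappa) can be sketched as below. Note the stand-ins: scikit-learn's SelectKBest and CART tree approximate the Weka CfsSubsetEval and J48 tools actually used in the thesis, and the data are synthetic.

```python
# Sketch of the general workflow: feature selection, majority-class under-sampling,
# decision-tree classification, evaluation by AUC and Kappa. Stand-ins are used for
# the Weka tools in the thesis (CfsSubsetEval ~ SelectKBest, J48 ~ CART); data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, cohen_kappa_score

X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)   # unbalanced classes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Under-sample the majority class in the training set to balance it.
rng = np.random.default_rng(0)
maj_idx, min_idx = np.where(y_tr == 0)[0], np.where(y_tr == 1)[0]
keep = np.concatenate([rng.choice(maj_idx, size=len(min_idx), replace=False), min_idx])
X_bal, y_bal = X_tr[keep], y_tr[keep]

selector = SelectKBest(mutual_info_classif, k=10).fit(X_bal, y_bal)
tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(
    selector.transform(X_bal), y_bal)

proba = tree.predict_proba(selector.transform(X_te))[:, 1]
pred = tree.predict(selector.transform(X_te))
print(f"AUC: {roc_auc_score(y_te, proba):.3f}  Kappa: {cohen_kappa_score(y_te, pred):.3f}")
```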

Relevance: 30.00%

Publisher:

Abstract:

Shell structures find use in many fields of engineering, notably the structural, mechanical, aerospace and nuclear-reactor disciplines. Axisymmetric shell structures are used as dome-type roofs, hyperbolic cooling towers, silos for the storage of grain, oil and industrial chemicals, and water tanks. Despite their thin walls, these structures derive strength from their curvature. The generally high strength-to-weight ratio of the shell form, combined with its inherent stiffness, has formed the basis of this vast application. With advances in computation technology, the finite element method and optimisation techniques, structural engineers have extremely versatile tools for the optimum design of such structures. Optimisation of shell structures can result not only in improved designs, but also in a large saving of material. The finite element method, being a general numerical procedure that can treat any shell problem to any desired degree of accuracy, requires several runs in order to obtain a complete picture of the effect of one parameter on the shell structure. This redesign/re-analysis cycle has been achieved via structural optimisation in the present research, and MSC/NASTRAN (a commercially available finite element code) has been used in this context for volume optimisation of axisymmetric shell structures under axisymmetric and non-axisymmetric loading conditions. The parametric study of different axisymmetric shell structures has revealed that the hyperbolic shape is the most economical solution for shells of revolution. To establish this, axisymmetric loading (self-weight and hydrostatic pressure) and non-axisymmetric loading (wind pressure and earthquake dynamic forces) have been modelled in a graphical pre- and post-processor (PATRAN), analyses have been performed with two finite element codes (ABAQUS and NASTRAN), numerical model verification studies have been performed, and the optimum material volume required in the walls of cylindrical, conical, parabolic and hyperbolic forms of axisymmetric shell structures has been evaluated and reviewed. Free vibration and transient earthquake analyses of hyperbolic shells were performed once it was established that the hyperbolic shape is the most economical under all possible loading conditions. The effects of important parameters of hyperbolic shell structures (shell wall thickness, height and curvature) have been evaluated, and empirical relationships have been developed to estimate an approximate value of the lowest (first) natural frequency of vibration. The outcome of this thesis has been the generation of new research information on the performance characteristics of axisymmetric shell structures that will facilitate improved designs of shells with better choice of shapes and enhanced levels of economy and performance. Keywords: axisymmetric shell structures, finite element analysis, volume optimisation, free vibration, transient response.
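For readers unfamiliar with the objective function in such studies, the quantity being minimised is the material volume of the shell wall. A generic thin-shell expression (not a formula quoted from the thesis) is:

```latex
% Material volume of a thin axisymmetric shell of revolution (generic expression,
% not taken from the thesis). r(s): meridian radius, t(s): wall thickness,
% s: arc length along the meridian, L: meridian length.
V = 2\pi \int_{0}^{L} r(s)\, t(s)\, \mathrm{d}s
```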