969 results for technology standard
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As more and more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to determine what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collections or on estimates of their size, and collection descriptions are often represented as term occurrence statistics. For search engine content analysis, an automatic ontology learning method is developed that trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general purpose Internet search engines in use today. Instead of representing collections as a set of terms, as is common in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and a reduction in the effects of synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation, using the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and the U.S. Department of Agriculture's search engine. In conclusion, this research shows that the ontology-based method obviates the need for collection size estimation.
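The abstract describes subject-based collection selection only at a high level. The sketch below is a minimal, purely illustrative take on the idea: each search engine is described by a distribution over subject labels (as an ontology classifier might produce), and engines are ranked for a query by how well their subject profile covers the query's subjects. Function names, subject labels and the scoring rule are assumptions for illustration, not details taken from the thesis.

```python
from collections import Counter

def rank_engines(query_subjects, engine_profiles):
    """Rank search engines by how well their subject profile covers
    the subjects detected in a query.

    query_subjects : list[str] -- subject labels assigned to the query
    engine_profiles: dict[str, Counter] -- per-engine counts of sampled
                     documents classified into each subject
    """
    q = Counter(query_subjects)
    scores = {}
    for engine, profile in engine_profiles.items():
        total = sum(profile.values()) or 1
        # Simple overlap measure: the engine's relative coverage of each
        # query subject, weighted by how often the subject occurs in the query.
        scores[engine] = sum(q[s] * profile.get(s, 0) / total for s in q)
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: a biomedical query should prefer the medical engine.
profiles = {
    "general_engine": Counter({"news": 500, "sport": 300, "health": 200}),
    "medical_engine": Counter({"health": 800, "genetics": 150}),
}
print(rank_engines(["health", "genetics"], profiles))
```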
Abstract:
Non-Alcoholic Fatty Liver Disease (NAFLD) is a condition that is frequently seen but seldom investigated. Until recently, NAFLD was considered benign, self-limiting and unworthy of further investigation, an opinion based on retrospective studies with relatively small numbers and scant follow-up of histology data (1). The prevalence in adults in the USA is 30%, and NAFLD is recognised as a common and increasing form of liver disease in the paediatric population (1). Australian data from New South Wales suggest the prevalence of NAFLD in "healthy" 15-year-olds is 10% (2). Non-alcoholic fatty liver disease is a condition where fat progressively invades the liver parenchyma. The degree of infiltration ranges from simple steatosis (fat only), to steatohepatitis (fat and inflammation), to steatohepatitis plus fibrosis (fat, inflammation and fibrosis), to cirrhosis (replacement of liver texture by scarred, fibrotic and non-functioning tissue). Non-alcoholic fatty liver is diagnosed by exclusion rather than inclusion. None of the currently available diagnostic techniques - liver biopsy, liver function tests (LFTs) or imaging (ultrasound, computerised tomography (CT) or magnetic resonance imaging (MRI)) - is specific for non-alcoholic fatty liver. An association exists between NAFLD, Non-Alcoholic Steatohepatitis (NASH) and irreversible liver damage, cirrhosis and hepatoma. However, a more pervasive aspect of NAFLD is its association with Metabolic Syndrome. This syndrome is characterised by increased insulin resistance (IR), and NAFLD is thought to be its hepatic representation. Those with NAFLD have an increased risk of death (3), and NAFLD is an independent predictor of atherosclerosis and cardiovascular disease (1). Liver biopsy is considered the gold standard for diagnosis, grading and staging of non-alcoholic fatty liver disease (4). Fatty liver is diagnosed when there is macrovesicular steatosis with displacement of the nucleus to the edge of the cell and at least 5% of the hepatocytes are seen to contain fat (4). Steatosis represents fat accumulation in liver tissue without inflammation. However, it is only called non-alcoholic fatty liver disease when alcohol intake above 20-30 g per day (5) has been excluded; non-alcoholic and alcoholic fatty liver are identical on histology (4). LFTs are indicative, not diagnostic: they indicate that a condition may be present but cannot identify what that condition is. When a patient presents with raised fasting blood glucose, low HDL (high-density lipoprotein) and elevated fasting triacylglycerols, they are likely to have NAFLD (6). Of the imaging techniques, MRI is the least variable and the most reproducible. With CT scanning, liver fat content can be semi-quantitatively estimated: with increasing hepatic steatosis, liver attenuation values decrease by 1.6 Hounsfield units for every milligram of triglyceride deposited per gram of liver tissue (7) (a short worked example follows the reference list). Ultrasound permits early detection of fatty liver, often in the preclinical stages before symptoms are present and serum alterations occur. Earlier, accurate reporting of this condition will allow appropriate intervention resulting in better patient health outcomes.
References
1. Chalasani N. Does fat alone cause significant liver disease: It remains unclear whether simple steatosis is truly benign. American Gastroenterological Association Perspectives, February/March 2008. www.gastro.org/wmspage.cfm?parm1=5097 Viewed 20th October 2008.
2. Booth M, George J, Denney-Wilson E. The population prevalence of adverse concentrations with adiposity of liver tests among Australian adolescents. Journal of Paediatrics and Child Health. 2008 November.
3. Catalano D, Trovato GM, Martines GF, Randazzo M, Tonzuso A. Bright liver, body composition and insulin resistance changes with nutritional intervention: a follow-up study. Liver Int. 2008 February; 1280-9.
4. Choudhury J, Sanyal A. Clinical aspects of fatty liver disease. Semin Liver Dis. 2004;24(4):349-62.
5. Dionysus Study Group. Drinking factors as cofactors of risk for alcohol induced liver change. Gut. 1997;41:845-50.
6. Preiss D, Sattar N. Non-alcoholic fatty liver disease: an overview of prevalence, diagnosis, pathogenesis and treatment considerations. Clin Sci. 2008;115:141-50.
7. American Gastroenterological Association. Technical review on nonalcoholic fatty liver disease. Gastroenterology. 2002;123:1705-25.
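Worked example of the CT relation quoted above: the 1.6 HU per mg/g figure is taken directly from the abstract (reference 7); the fat load used in the calculation is an assumed, illustrative value.

```latex
% Approximate linear relation quoted in the abstract (ref. 7):
% attenuation falls about 1.6 HU per mg triglyceride per g of liver tissue.
\[
\Delta \mathrm{HU} \approx -1.6 \times m_{\mathrm{TG}},
\qquad m_{\mathrm{TG}} = \text{mg triglyceride per g liver tissue}
\]
% Illustrative (assumed) fat load of 10 mg/g:
\[
\Delta \mathrm{HU} \approx -1.6 \times 10 = -16 \ \text{HU below normal liver attenuation}
\]
```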
Abstract:
Vitamin D deficiency and insufficiency are now seen as a contemporary health problem in Australia, with possible widespread health effects not limited to bone health (1). Despite this, the Vitamin D status (measured as serum 25-hydroxyvitamin D (25(OH)D)) of ambulatory adults has been overlooked in this country. Serum 25(OH)D status is especially important among this group, as studies have shown a link between Vitamin D and fall risk in older adults (2). Limited data also exist on the contributions of sun exposure via ultraviolet radiation and dietary intake to serum 25(OH)D status in this population. The aims of this project were to: assess the serum 25(OH)D status of a group of older ambulatory adults in South East Queensland; assess the association between their serum 25(OH)D status and functional measures as possible indicators of fall risk; obtain data on the sources of Vitamin D in this population and assess whether this intake was related to serum 25(OH)D status; and describe sun protection and exposure behaviours in this group and investigate whether a relationship existed between these and serum 25(OH)D status. The collection of these data helps address key gaps identified in the literature with regard to this population group and their Vitamin D status in Australia. A representative convenience sample of participants (N=47) over 55 years of age was recruited for this cross-sectional, exploratory study, which was undertaken in December 2007 in south-east Queensland (Brisbane and the Sunshine Coast). Participants were required to complete a sun exposure questionnaire in addition to a Calcium and Vitamin D food frequency questionnaire. Timed Up and Go and handgrip dynamometry tests were used to examine functional capacity. Serum 25(OH)D status and blood measures of Calcium, Phosphorus and Albumin were determined through blood tests. The mean and median serum 25(OH)D for all participants in this study were 85.8 nmol/L (standard deviation 29.7 nmol/L) and 81.0 nmol/L (range 22-158 nmol/L), respectively. Analysis at the bivariate level revealed a statistically significant relationship between serum 25(OH)D status and location, with participants living on the Sunshine Coast having a mean serum 25(OH)D status 21.3 nmol/L higher than participants living in Brisbane (p=0.014). While at the descriptive level there was an apparent trend towards higher outdoor exposure and increasing levels of serum 25(OH)D, no statistically significant associations were observed between serum 25(OH)D status and the measures of outdoor exposure, sun protection behaviours and phenotypic characteristics. Intake of both Calcium and Vitamin D was low in this sample: sixty-eight percent (68%) of participants did not meet the Estimated Average Requirement (EAR) for Calcium (median 771.0 mg; range 218.0-2616.0 mg), while eighty-seven percent (87%) did not meet the Adequate Intake (AI) for Vitamin D (median 4.46 µg; range 0.13-30.0 µg). This raises the question of how realistic meeting the new Adequate Intakes for Vitamin D is when there is such a low level of Vitamin D fortification in this country. However, participants meeting the AI for Vitamin D were observed to have a significantly higher serum 25(OH)D status than those not meeting it (p=0.036), suggesting that meeting the AI for Vitamin D may play a significant role in determining Vitamin D status in this population.
By stratifying the data by categories of outdoor exposure time, a trend was observed towards increased importance of Vitamin D dietary intake as a possible determinant of serum 25(OH)D status in participants with lower outdoor exposure. While a trend towards higher Timed Up and Go scores in participants with higher 25(OH)D status was seen, this was only significant for females (p=0.014). Handgrip strength showed no statistically significant association with serum 25(OH)D status. The high serum 25(OH)D status in our sample almost certainly explains the limited relationship between functional measures and serum 25(OH)D. However, the observation of an association between slower Timed Up and Go speeds and lower serum 25(OH)D levels, even with a small sample size, is important, as slower Timed Up and Go speeds have been associated with increased fall risk in older adults (3). Multivariable regression analysis revealed location as the only significant determinant of serum 25(OH)D status (p=0.014), with trends (p ≥ 0.1) towards higher serum 25(OH)D for participants who met the AI for Vitamin D and who rated themselves as having a higher health status. The results of this exploratory study show that 93.6% of participants had adequate 25(OH)D status, possibly due to measurement being taken in the summer season and the convenience nature of the sample. However, many participants did not meet their dietary Calcium and Vitamin D requirements, which may indicate inadequate intake of these nutrients in older Australians and a higher risk of osteoporosis. The relationship between serum 25(OH)D and functional measures in this population also requires further study, especially in older adults displaying Vitamin D insufficiency or deficiency.
Abstract:
This paper presents research findings about the use of remote desktop applications to teach music sequencing software. It highlights the successes, shortcomings and interactive issues encountered during a pilot project with a theoretical focus on a specific interactive bottleneck. The paper proposes a new delivery and partnership model to widen this bottleneck, which currently hinders interactions between the technical support, education and professional development communities in music technology.
Abstract:
With the advent of Service Oriented Architecture, Web services have gained tremendous popularity. Due to the availability of a large number of Web services, finding an appropriate Web service according to the requirements of the user is a challenge. This warrants the need to establish an effective and reliable process of Web service discovery. A considerable body of research has emerged to develop methods that improve the accuracy of Web service discovery and match the best service. The process of Web service discovery often results in suggesting many individual services that only partially fulfil the user's interest. Considering the semantic relationships of the words used to describe the services, as well as the input and output parameters, can lead to more accurate Web service discovery, and appropriate linking of individually matched services can then fully satisfy the user's requirements. This research proposes to integrate a semantic model and a data mining technique to enhance the accuracy of Web service discovery. A novel three-phase Web service discovery methodology is proposed. The first phase performs match-making to find semantically similar Web services for a user query. In order to perform semantic analysis of the content of Web Services Description Language (WSDL) documents, a support-based latent semantic kernel is constructed using an innovative concept of binning and merging over a large collection of text documents covering diverse domains of knowledge. The use of a generic latent semantic kernel constructed with a large number of terms helps to find the hidden meaning of query terms which otherwise could not be found. Sometimes a single Web service is unable to fully satisfy the requirement of the user; in such cases, a composition of multiple inter-related Web services is presented to the user. The task of checking the possibility of linking multiple Web services is done in the second phase. Once the feasibility of linking Web services is checked, the objective is to provide the user with the best composition of Web services. In the link analysis phase, the Web services are modelled as nodes of a graph and an all-pairs shortest-path algorithm is applied to find the optimum path at the minimum cost of traversal. The third phase, system integration, integrates the results of the preceding two phases using an original fusion algorithm in the fusion engine. Finally, the recommendation engine, which is an integral part of the system integration phase, makes the final recommendations, including individual and composite Web services, to the user. In order to evaluate the performance of the proposed method, extensive experimentation has been performed. Results of the proposed support-based semantic kernel method of Web service discovery are compared with the results of a standard keyword-based information-retrieval method and a clustering-based machine-learning method of Web service discovery. The proposed method outperforms both the information-retrieval and machine-learning based methods. Experimental results and statistical analysis also show that the best Web service compositions are obtained by considering 10 to 15 of the Web services found in phase I for linking. Empirical results also ascertain that the fusion engine boosts the accuracy of Web service discovery by combining the inputs from both the semantic analysis (phase I) and the link analysis (phase II) in a systematic fashion.
Overall, the accuracy of Web service discovery with the proposed method shows a significant improvement over traditional discovery methods.
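The link-analysis phase is described above only in outline: services become graph nodes and an all-pairs shortest-path algorithm finds the cheapest composition. A minimal sketch of that idea using the classic Floyd-Warshall algorithm follows; the service names, compatibility edges and costs are invented for illustration and are not taken from the thesis.

```python
import math

def floyd_warshall(nodes, edges):
    """All-pairs shortest paths over a directed, weighted graph.

    nodes : list of service identifiers
    edges : dict mapping (src, dst) -> cost of chaining dst after src
            (e.g. based on input/output parameter compatibility)
    Returns dist[(a, b)] = minimum total cost of composing a ... b.
    """
    dist = {(a, b): (0 if a == b else edges.get((a, b), math.inf))
            for a in nodes for b in nodes}
    for k in nodes:
        for i in nodes:
            for j in nodes:
                via_k = dist[(i, k)] + dist[(k, j)]
                if via_k < dist[(i, j)]:
                    dist[(i, j)] = via_k
    return dist

# Toy composition problem with invented services and costs.
services = ["GeocodeAddress", "FindHotels", "BookRoom"]
links = {
    ("GeocodeAddress", "FindHotels"): 1.0,
    ("FindHotels", "BookRoom"): 2.0,
    ("GeocodeAddress", "BookRoom"): 5.0,  # direct link exists but costs more
}
dist = floyd_warshall(services, links)
print(dist[("GeocodeAddress", "BookRoom")])  # 3.0, composed via FindHotels
```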
Abstract:
In this article I outline and demonstrate a synthesis of the methods developed by Lemke (1998) and Martin (2000) for analyzing evaluations in English. I demonstrate the synthesis using examples from a 1.3-million-word technology policy corpus drawn from institutions at the local, state, national, and supranational levels. Lemke's (1998) critical model is organized around the broad 'evaluative dimensions' that are deployed to evaluate propositions and proposals in English. Martin's (2000) model is organized, with a more overtly systemic-functional orientation, around the concept of 'encoded feeling'. In applying both these models at different times, whilst recognizing their individual usefulness and complementarity, I found specific limitations that led me to work towards a synthesis of the two approaches. I also argue for the need to consider genre, media, and institutional aspects more explicitly when claiming intertextual and heteroglossic relations as the basis for inferred evaluations. A basic assertion made in this article is that the perceived Desirability of a process, person, circumstance, or thing is identical to its 'value'. But the Desirability of anything is a socially, and thus historically, conditioned attribution that requires significant amounts of institutional inculcation of other 'types' of value: appropriateness, importance, beauty, power, and so on. I therefore propose a method informed by critical discourse analysis (CDA) that sees evaluation as happening on at least four interdependent levels of abstraction.
Abstract:
In architecture courses, instilling a wider understanding of the industry-specific representations practised in the building industry is normally done under the auspices of Technology and Science subjects. Traditionally, building industry professionals communicated their design intentions using industry-specific representations. Originally these mainly two-dimensional representations, such as plans, sections, elevations and schedules, were produced manually, using a drawing board. This manual process has since been digitised in the form of Computer Aided Design and Drafting (CADD), or ubiquitously simply CAD. While CAD has significant productivity and accuracy advantages over the earlier manual method, it still only produces industry-specific representations of the design intent; essentially, CAD is a digital version of the drawing board. The tool used for the production of these representations in industry is still mainly CAD. This is also the approach taken in most traditional university courses and mirrors the reality of the situation in the building industry. A successor to CAD, in the form of Building Information Modelling (BIM), is presently evolving in the construction industry. CAD is mostly a technical tool that conforms to existing industry practices; BIM, on the other hand, is revolutionary both as a technical tool and as an industry practice. Rather than producing representations of design intent, BIM produces an exact virtual prototype of any building that, in an ideal situation, is centrally stored and freely exchanged between the project team. Essentially, BIM builds any building twice: once in the virtual world, where any faults are resolved, and then in the real world. There is, however, no established model for learning through the use of this technology in architecture courses. Queensland University of Technology (QUT), a tertiary institution that maintains close links with industry, recognises the importance of equipping its graduates with skills that are relevant to industry. BIM skills are in increasing demand throughout the construction industry as industry practices evolve. As such, during the second half of 2008, QUT fourth-year architectural students were formally introduced to BIM for the first time, as both a technology and an industry practice. This paper outlines the teaching team's experiences and methodologies in offering a BIM unit (Architectural Technology and Science IV) at QUT for the first time and provides a description of the learning model. The paper presents the results of a survey on the learners' perspectives of both BIM and their learning experiences as they learn about and through this technology.
Abstract:
Organisations are increasingly investing in complex technological innovations, such as enterprise information systems, with the aim of improving business operations and thereby gaining competitive advantage. However, the implementation of technological innovations tends to focus excessively on either technology innovation effectiveness (also known as system effectiveness) or the resulting operational effectiveness; focusing on only one of them is detrimental to long-term enterprise benefits, because the real value of the technological innovation is not achieved. The lack of research on the dimensions and performance objectives that organisations must focus on is the main reason for this misalignment. This research uses a combined qualitative and quantitative, three-stage methodological approach. Initial findings suggest that factors such as quality of information (from technology innovation effectiveness) and quality and speed (from operational effectiveness) are important and significantly correlated factors that promote the alignment between technology innovation effectiveness and operational effectiveness.
Abstract:
This chapter reports on Australian and Swedish experiences in the iterative design, development, and ongoing use of interactive educational systems we call 'Media Maps'. Like maps in general, Media Maps are usefully understood as complex cultural technologies; that is, they are not only physical objects, tools and artefacts, but also information creation and distribution technologies whose use and development are embedded in systems of knowledge and social meaning. Drawing upon Australian and Swedish experiences with one Media Map technology, the chapter illustrates this three-layered approach to the development of media mapping. It shows how media mapping is being used to create authentic learning experiences for students preparing for work in the rapidly evolving media and communication industries. We also contextualise media mapping as a response to various challenges for curriculum and learning design in Media and Communication Studies that arise from shifts in tertiary education policy in a global knowledge economy.
Abstract:
Various reasons have been proffered for female under-representation in tertiary information technology (IT) courses and the IT industry, most relating to cultural mores. The 2006 Geek Goddess calendar was designed to alter IT's "geeky" image, and the term is used here to represent young women enrolled in pre-service IT teaching courses. Their special mix of IT and teaching draws on conflicting stereotypes and represents a micro-climate which is typically lost in studies of IT occupations because of the aggregation of all IT roles. This paper reports on a small-scale investigation of female students (N=25) at a university in Queensland (Australia) studying to become teachers of secondary IT subjects. They are entering the IT industry, gendered as a "male" occupation, through the safe space of teaching, a discipline allied to feminine qualities of nurturing. They are "geek goddesses" who - perhaps to balance the masculine and feminine of these occupations - have decided to go to school rather than into corporations or government.
Abstract:
Executive Summary: The objective of this report was to use the Sydney Opera House as a case study of the application of Building Information Modelling (BIM). The Sydney Opera House is a large, complex building with a very irregular building configuration, which makes it a challenging test. A number of key concerns are evident at SOH:
• the building structure is complex, and building service systems - already the major cost of ongoing maintenance - are undergoing technology change, with new computer-based services becoming increasingly important;
• the current "documentation" of the facility comprises several independent, partly overlapping systems and is inadequate to support the services currently required and those planned for the future;
• the building has reached a milestone age in terms of the condition and maintainability of key public areas and service systems, the functionality of spaces, and longer-term strategic management;
• many business functions, such as space or event management, require up-to-date information about the facility that is currently inadequately delivered, and expensive and time-consuming to update and deliver to customers;
• major building upgrades are being planned that will put considerable strain on existing Facilities Portfolio services and their capacity to manage them effectively.
While some of these concerns are unique to the House, many will be common to larger commercial and institutional portfolios. The work described here supported a complementary task which sought to identify whether a building information model - an integrated building database - could be created that would support asset and facility management functions (see Sydney Opera House - FM Exemplar Project, Report Number: 2005-001-C-4, Building Information Modelling for FM at Sydney Opera House), a business strategy that has been well demonstrated. The development of the BIMSS - Open Specification for BIM - has been surprisingly straightforward. The lack of technical difficulties in converting the House's existing conventions and standards to the new model-based environment can be related to three key factors:
• SOH Facilities Portfolio - the internal group responsible for asset and facility management - already have well-established building and documentation policies in place. The setting of, and adherence to, well thought out operational standards has been based on the need to create an environment that is understood by all users and that addresses the major business needs of the House.
• The second factor is the nature of the IFC Model Specification used to define the BIM protocol. The IFC standard is based on building practice and nomenclature widely used in the construction industries across the globe. For example, the nomenclature of building parts - e.g. IfcWall - corresponds to normal terminology but extends the traditional drawing environment currently used for design and documentation. This demonstrates that the international IFC model accurately represents local practice for building data representation and management.
• A BIM environment sets up opportunities for innovative processes that can exploit the rich data in the model and improve services and functions for the House: for example, several high-level processes have been identified that could benefit from standardised Building Information Models, such as maintenance processes using engineering data, business processes using scheduling, venue access and security data, and benchmarking processes using building performance data.
The new technology matches business needs for current and new services. The adoption of IFC-compliant applications opens the way for shared building model collaboration and new processes, a significant new focus of the BIM standards. In summary, SOH's current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. These BIM standards and their application to the Opera House are intended as a template for other organisations to adopt for their own procurement and facility management activities. Appendices provide an overview of the IFC Integrated Object Model and an understanding of IFC Model Data.
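The report describes the IFC building model only in prose (for example the IfcWall entity mentioned above). As a purely illustrative sketch, not part of the SOH project, the snippet below shows how an IFC file could be inspected with the open-source ifcopenshell Python library; the file name is a placeholder.

```python
import ifcopenshell  # open-source IFC toolkit (assumed to be installed)

# Open an IFC model file; "soh_partial.ifc" is a placeholder name.
model = ifcopenshell.open("soh_partial.ifc")

# IFC entity names mirror everyday building terminology (IfcWall, IfcDoor, ...).
for wall in model.by_type("IfcWall"):
    # GlobalId and Name are standard IFC attributes on building elements.
    print(wall.GlobalId, wall.Name)

# Count a few common element types to get a feel for the model's contents.
for ifc_type in ("IfcWall", "IfcDoor", "IfcSpace"):
    print(ifc_type, len(model.by_type(ifc_type)))
```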
Abstract:
“SOH see significant benefit in digitising its drawings and operation and maintenance manuals. Since SOH do not currently have digital models of the Opera House structure or other components, there is an opportunity for this national case study to promote the application of Digital Facility Modelling using standardized Building Information Models (BIM)”. The digital modelling element of this project examined the potential of building information models for facility management, focusing on the following areas:
• the re-usability of building information for FM purposes;
• BIM as an integrated information model for facility management;
• extendibility of the BIM to cope with business-specific requirements;
• commercial facility management software using standardised building information models;
• the ability to add (organisation-specific) intelligence to the model;
• a roadmap for SOH to adopt BIM for FM.
The project has established that BIM - building information modelling - is an appropriate and potentially beneficial technology for the storage of integrated building, maintenance and management data for SOH. Based on the attributes of a BIM, several advantages can be envisioned: consistency in the data, intelligence in the model, multiple representations, and a source of information for intelligent programs and intelligent queries. The IFC - open building exchange standard - specification provides comprehensive support for asset and facility management functions, and offers new management, collaboration and procurement relationships based on the sharing of intelligent building data. The major advantages of using an open standard are: information can be read and manipulated by any compliant software; reduced user "lock-in" to proprietary solutions; third-party software can be the "best of breed" to suit the process and scope at hand; standardised BIM solutions consider the wider implications of information exchange outside the scope of any particular vendor; information can be archived as ASCII files for archival purposes; and data quality can be enhanced, as a single source of users' information has improved accuracy, correctness, currency, completeness and relevance. SOH's current building standards have been successfully drafted for a BIM environment and are confidently expected to be fully developed when BIM is adopted operationally by SOH. There have been remarkably few technical difficulties in converting the House's existing conventions and standards to the new model-based environment. This demonstrates that the IFC model represents world practice for building data representation and management (see Sydney Opera House - FM Exemplar Project, Report Number 2005-001-C-3, Open Specification for BIM: Sydney Opera House Case Study). Availability of FM applications based on BIM is in its infancy, but focussed systems are already in operation internationally and show excellent prospects for implementation at SOH. In addition to the generic benefits of standardised BIM described above, the following FM-specific advantages can be expected from this new integrated facilities management environment: faster and more effective processes, controlled whole-of-life costs and environmental data, better customer service, a common operational picture for current and strategic planning, visual decision-making, and a total ownership cost model.
Tests with partial BIM data – provided by several of SOH’s current consultants – show that the creation of a SOH complete model is realistic, but subject to resolution of compliance and detailed functional support by participating software applications. The showcase has demonstrated successfully that IFC based exchange is possible with several common BIM based applications through the creation of a new partial model of the building. Data exchanged has been geometrically accurate (the SOH building structure represents some of the most complex building elements) and supports rich information describing the types of objects, with their properties and relationships.
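As a further purely illustrative sketch, not drawn from the report, the FM-oriented data the summary refers to (maintenance, warranty and similar attributes) is typically carried on IFC elements as property sets. The snippet assumes the ifcopenshell library again; the file name, element type chosen and property set names are hypothetical.

```python
import ifcopenshell
import ifcopenshell.util.element  # utility module providing get_psets()

model = ifcopenshell.open("soh_partial.ifc")  # placeholder file name

# Property sets carry the kind of data FM processes need (serial numbers,
# maintenance intervals, warranty data, ...). Names below are hypothetical.
for unit in model.by_type("IfcFlowTerminal"):  # e.g. air outlets, fixtures
    psets = ifcopenshell.util.element.get_psets(unit)
    maintenance = psets.get("Pset_Maintenance", {})  # hypothetical pset name
    print(unit.GlobalId, maintenance.get("NextServiceDate", "n/a"))
```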
Abstract:
This article reports on the impact on student personal creativity of a longitudinal study that had as its major goal the creation of a unique intervention program for elementary students. The intervention was based on the National Profile and Statement (Curriculum Corporation, 1994a, 1994b) for the curriculum area of technology. The intervention program comprised thematically based units of work that integrated all eight Australian Key Learning Areas (KLA). A pretest/posttest control group design investigation (Campbell & Stanley, 1963) was undertaken with 580 students from 7 schools and 24 class groups that were randomly divided into 3 treatment groups. One group (10 classes) formed the control group. Another 7 classes received the year-long intervention program, while the remaining 7 classes received the intervention, but with the added seamless integration of their available classroom computer technologies. The effect of the intervention on the personal creativity characteristics of the students involved in the study was assessed using the Creativity Checklist, an instrument that was developed during the study. The results suggest that the purposeful integration of computer technology with the intervention program positively affects the personal creativity characteristics of students.
Abstract:
Construction sector application of lead indicators generally, and Positive Performance Indicators (PPIs) particularly, is largely seen by the sector as not providing generalizable indicators of safety effectiveness. Similarly, safety culture is often cited as an essential factor in improving safety performance, yet there is no known reliable way of measuring safety culture. This paper proposes that the accurate measurement of safety effectiveness and safety culture is a requirement for assessing safe behaviours, safety knowledge, effective communication and safety performance. Currently there are no standard national or international safety effectiveness indicators (SEIs) accepted by the construction industry. The challenge is that quantitative survey instruments developed for measuring safety culture and/or safety climate are methodologically flawed and do not produce reliable and representative data concerning attitudes to safety. Measures that combine quantitative and qualitative components are needed to provide clear utility for safety effectiveness indicators.
Abstract:
The construction industry is categorised as an information-intensive industry and described as one of the most important industries in any developed country, facing a period of rapid and unparalleled change (Industry Science Resources 1999; Love P.E.D., Tucker S.N. et al. 1996). Project communications are becoming increasingly complex, with a growing need and fundamental drive to collaborate electronically at project level and beyond (Olesen K. and Myers M.D. 1999; Thorpe T. and Mead S. 2001; CITE 2003). Yet the industry is also identified as having a considerable lack of knowledge and awareness about innovative information and communication technology (ICT) and web-based communication processes, systems and solutions which may prove beneficial in the procurement, delivery and life cycle of projects (NSW Government 1998; Kajewski S. and Weippert A. 2000). The Internet has arguably revolutionised the way in which information is stored, exchanged and viewed, opening new avenues for business which only a decade ago were deemed almost inconceivable (DCITA 1998; IIB 2002). In an attempt to put these 'new avenues of business' into perspective, this report provides an overall 'snapshot' of current public and private construction industry sector opportunities and practices in the implementation and application of web-based ICT tools, systems and processes (e-Uptake). Research found that, even with a reserved uptake, the construction industry and its participating organisations are making concerted efforts (fortunately with positive results) to take up innovative forms of doing business via the Internet, including e-Tendering, which makes it possible to manage the entire tender-letting process electronically and online (Anumba C.J. and Ruikar K. 2002; ITCBP 2003). Furthermore, Government (often a key client within the construction industry), with its increased tendency to transact its business electronically, undoubtedly has an effect on how various private industry consultants, contractors, suppliers, etc. do business (Murray M. 2003), offering a wide range of current and anticipated e-facilities and services, including e-Tendering (Ecommerce 2002). Overall, doing business electronically is found to have a profound impact on the way today's construction businesses operate, streamlining existing processes, with the growth of innovative tools such as e-Tendering offering the construction industry new responsibilities and opportunities for all parties involved (ITCBP 2003). It is therefore important that these opportunities are accessible to as many construction industry businesses as possible (The Construction Confederation 2001). Historically, there is a considerable exchange of information between the various parties during a tendering process, where accuracy and efficiency of documentation are critical. Traditionally this process is either paper-based, involving large volumes of supporting tender documentation, or runs on a number of stand-alone, incompatible computer systems, usually costly to both the client and the contractor. As such, having a standard electronic exchange format that allows all parties involved in an electronic tender process to access one system only, via the Internet, saves both time and money, eliminates transcription errors and increases the speed of bid analysis (The Construction Confederation 2001).
Supporting this research project's aims and objectives, researchers set out to determine the construction industry's current 'state of play' in relation to e-Tendering opportunities. The report also provides brief introductions to several Australian and international e-Tender systems identified during this investigation. e-Tendering, in its simplest form, is described as the electronic publishing, communicating, accessing, receiving and submitting of all tender-related information and documentation via the Internet, thereby replacing the traditional paper-based tender process and achieving a more efficient and effective business process for all parties involved (NT Government 2000; NT Government 2000; NSW Department of Commerce 2003; NSW Government 2003). Although most of the e-Tender websites investigated at the time maintain that their tendering processes and capabilities are 'electronic', research shows these e-Tendering systems vary from reasonably advanced to more 'basic' electronic tender notification and archiving services for various industry sectors. Research also indicates that an e-Tender system should have a number of basic features and capabilities (a minimal sketch illustrating the sealed-bid behaviour follows this summary), including:
• All tender documentation is distributed via a secure web-based tender system, avoiding the need for collating paperwork and couriers.
• The client/purchaser should be able to upload a notice and/or invitation to tender onto the system.
• Notification is sent out electronically (usually via email) for suppliers to download the information and return their responses electronically (online).
• During the tender period, updates and queries are exchanged through the same e-Tender system.
• The client/purchaser should only be able to access the tenders after the deadline has passed.
• All tender-related information is held in a central database, which should be easily searchable and fully audited, with all activities recorded.
• It is essential that tender documents are not read or submitted by unauthorised parties.
• Users of the e-Tender system are to be properly identified and registered via controlled access. In simple terms, security has to be as good as, if not better than, a manual tender process. Data is to be encrypted and users authenticated by means such as digital signatures, electronic certificates or smartcards.
• All parties must be assured that no 'undetected' alterations can be made to any tender.
• The tenderer should be able to amend the bid right up to the deadline, whilst the client/purchaser cannot obtain access until the submission deadline has passed.
• The e-Tender system may also include features such as a database of service providers with spreadsheet-based pricing schedules, which can make it easier for a potential tenderer to electronically prepare and analyse a tender.
Research indicates the efficiency of an e-Tender process is well supported internationally, with a significant number of similar e-Tender benefits identified during this investigation. Both construction industry and Government participants generally agree that the implementation of an automated e-Tendering process or system enhances the overall quality, timeliness and cost-effectiveness of a tender process, and provides a more streamlined method of receiving, managing and submitting tender documents than the traditional paper-based process.
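The security and integrity requirements listed above (sealed submissions until the deadline, amendments allowed only before the deadline, no undetected alterations) can be illustrated with a minimal sketch. This is not any of the systems surveyed in the report; the class, bidder names and dates are invented for illustration only.

```python
import hashlib
from datetime import datetime, timezone

class TenderBox:
    """Toy model of a sealed e-Tender box: bids may be added or amended
    until the deadline, and can only be opened by the client afterwards."""

    def __init__(self, deadline: datetime):
        self.deadline = deadline
        self._bids = {}  # tenderer -> (document bytes, integrity hash)

    def submit(self, tenderer: str, document: bytes, now: datetime):
        if now >= self.deadline:
            raise PermissionError("deadline passed: bid cannot be submitted or amended")
        # Store a hash alongside the document so undetected alteration is impossible.
        self._bids[tenderer] = (document, hashlib.sha256(document).hexdigest())

    def open_bids(self, now: datetime):
        if now < self.deadline:
            raise PermissionError("client cannot access bids before the deadline")
        # Verify integrity before releasing each bid to the client.
        return {t: doc for t, (doc, digest) in self._bids.items()
                if hashlib.sha256(doc).hexdigest() == digest}

# Usage with invented times and bidders.
box = TenderBox(deadline=datetime(2009, 1, 30, 17, 0, tzinfo=timezone.utc))
box.submit("Acme Constructions", b"bid v1", now=datetime(2009, 1, 20, tzinfo=timezone.utc))
box.submit("Acme Constructions", b"bid v2 (amended)", now=datetime(2009, 1, 29, tzinfo=timezone.utc))
print(box.open_bids(now=datetime(2009, 1, 31, tzinfo=timezone.utc)))
```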
On the other hand, researchers have also identified a range of challenges and perceptions that seem to hinder the uptake of this innovative approach to tendering electronically, and there are undoubtedly many more barriers challenging the successful implementation and adoption of an e-Tendering system or process. A central concern is security when industry organisations have to use the Internet for electronic information transfer. As a result, when it comes to e-Tendering, industry participants insist that these innovative tendering systems be developed to ensure the utmost security and integrity. Finally, if Australian organisations continue to explore the competitive 'dynamics' of the construction industry without realising the current and future trends and benefits of adopting innovative processes such as e-Tendering, they will limit their opportunities to expand into overseas markets and allow international firms to continue entering local markets successfully. As such, researchers believe that increased knowledge, awareness and successful implementation of innovative systems and processes raise great expectations regarding their contribution towards 'stimulating' the globalisation of electronic procurement activities and improving overall business and project performance throughout the construction industry sectors and the overall marketplace (NSW Government 2002; Harty C. 2003; Murray M. 2003; Pietroforte R. 2003). Achieving the successful integration of an innovative e-Tender solution with an existing or traditional process can be complex and, if not done correctly, could lead to failure (Bourn J. 2002).